DATA MINING
ASSIGNMENT 0
Q1) Which of the following software is used to store and manage data in large enterprises:
A) Database Management System
B) Computer Networks
C) Operating System
D) Compiler
Q2) The most common form of Database Management System is:
A) Transitional
B) Rotational
C) Relational
D) Translational
Q3) Data Mining is valuable to an enterprise for:
A) Finding patterns and trends
B) Security
C) Communication
D) Human Resource
Q4) Data Mining is also commonly known as:
A) Web Searching
B) Data Scraping
C) Data Transfer
D) Knowledge Discovery in Database
Q5) On what kind of data can data mining algorithms be applicable:
A) Video
B) Text
C) Image
D) All of the above
Q6) Which of the following applications involve classification?
A) Identifying Similar Households based on Electricity Consumption
B) Predicting Price of Rice Next Year
C) Identifying Supermarket Items Commonly Bought Together
D) Detecting Fraud Users of a Credit Card
Q7) Which of the following applications involve clustering?
A) Identifying Similar Households based on Electricity Consumption
B) Predicting Price of Rice Next Year
C) Identifying Supermarket Items Commonly Bought Together
D) Detecting Fraud Users of a Credit Card
Q8) For which of the following tasks can we use Regression?
A) Identifying Similar Households Based on Electricity Consumption
B) Predicting Price of Rice Next Year
C) Identifying Supermarket Items Commonly Bought Together
D) Detecting Fraud Users of a Credit Card
Q9) For which of the following can we use Association Rules?
A) Identifying Similar Households Based on Electricity Consumption
B) Predicting Price of Rice Next Year
C) Identifying Supermarket Items Commonly Bought Together
D) Detecting Fraud Users of a Credit Card
Q10) Data Mining may involve analysis of:
A) Large Volume of Data
B) Heterogeneous Data
C) Dynamic Data
D) All of the Above

Data Mining
Assignment 1
All Questions are of 1 mark.
1. Which of the following is usually the last step in the data mining process?
a) Visualization
b) Preprocessing
c) Modelling
d) Deployment
Ans: d
Explanation: The last step in the data mining process is to deploy the models to a production
environment. Deployment is important because it makes the models available to end users.
2. Sales database of items in a supermarket can be considered as an example of:
a) Record data
b) Ordered data
c) Graph data
d) None of the above
Ans: a
Explanation: The most basic form of record data has no explicit relationship among records or
data fields, and every record (object) has the same set of attributes. Record data is usually
stored either in flat files or in relational databases.
3. HTML links are an example of:
a) Record data
b) Ordered data
c) Graph data
d) None of the above
Ans: c
Explanation: HTML links are an example of graph data.
4. Name of a place can be considered an attribute of which type?
a) Nominal
b) Ordinal
c) Interval
d) Ratio
Ans: a
Explanation: Nominal relates to names. The values of a nominal attribute are names of things or
symbols. There is no order (rank, position) among the values of a nominal attribute.
5. A store sells 10 items. The maximum possible number of candidate 3-itemsets is:
a) 120
b) 6
c) 15
d) 56
Ans: a
Explanation: Number of ways of choosing 3 items from 10 items is 10C3 = 120
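As a quick sanity check on the count, the candidate 3-itemsets can be enumerated directly (a minimal Python sketch; the item labels 1-10 are placeholders):

```python
from itertools import combinations
from math import comb

items = range(1, 11)                       # the store's 10 items, labelled 1..10
candidates = list(combinations(items, 3))  # every possible 3-itemset
print(len(candidates), comb(10, 3))        # 120 120
```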
6. If a record data matrix has a reduced number of columns after a transformation, the
transformation has performed:
a) Data Sampling
b) Dimensionality Reduction
c) Noise Cleaning
d) Discretization
Ans: b
Explanation: Dimensionality reduction is the process of reducing the number of random variables
under consideration, by obtaining a set of principal variables.
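As an illustration of such a transformation (not part of the question), the sketch below applies scikit-learn's PCA, one standard dimensionality reduction method, to a random record data matrix; the data and component count are arbitrary:

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 10)                       # record data matrix: 100 records, 10 attributes
X_reduced = PCA(n_components=3).fit_transform(X)  # keep 3 principal components
print(X.shape, X_reduced.shape)                   # (100, 10) (100, 3): fewer columns after the transformation
```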
Answer Q7-Q10 based on the following table:
Transaction ID | Itemset
1 | {1, 2, 4, 5}
2 | {2, 3, 5}
3 | {1, 2, 4, 5}
4 | {1, 2, 3, 5}
5 | {1, 2, 3, 4, 5}
6 | {2, 3, 4}
7. Support of rule {4,5} -> {1} is:
a) 1
b) 0.5
c) 0.25
d) 0
Ans: b
Explanation: Support of X -> Y is support({X,Y}) / |T| = 3/6 = 0.5.
8. Confidence of rule {4,5} -> {1} is:
a) 1
b) 0.5
c) 0.25
d) 0.75
Ans: a
Explanation: Confidence measures how often the consequent occurs in transactions that contain the antecedent.
Confidence(X -> Y) = support({X,Y}) / support({X}) = (3/6) / (3/6) = 1.
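Both measures can be verified directly against the transaction table above with a few lines of Python (a minimal sketch; the helper names are ours):

```python
# Transactions from the table above
transactions = [
    {1, 2, 4, 5}, {2, 3, 5}, {1, 2, 4, 5},
    {1, 2, 3, 5}, {1, 2, 3, 4, 5}, {2, 3, 4},
]

def support(itemset):
    # fraction of transactions that contain every item of the itemset
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(X, Y):
    # support of X union Y divided by support of X
    return support(X | Y) / support(X)

print(support({4, 5} | {1}), confidence({4, 5}, {1}))  # 0.5 1.0   (Q7, Q8)
print(support({1} | {2, 5}), confidence({1}, {2, 5}))  # 0.666.. 1.0   (Q9, Q10)
```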
9. Support of {1} -> {2,5} is:
a) 2/3
b) 2/2
c) 1/4
d) 3/4
Ans: a
Explanation: Support of X -> Y is support({X,Y}) / |T| = 4/6 = 2/3.
10. Confidence of {1} -> {2,5} is:
a) 2/3
b) 1
c) 0
d) 0.5
Ans: b
Explanation: Confidence measures how often the consequent occurs in transactions that contain the antecedent.
Confidence(X -> Y) = support({X,Y}) / support({X}) = (4/6) / (4/6) = 1.

Data Mining: Assignment Week 2
1. If a store has N items, the number of possible itemsets is:
A2NI
BM
cw
DNA
2. An association rule is valid if it satisfies:
A. Support criteria
B. Confidence criteria
C. Both support and confidence criteria
D. None of the above
3. An itemset is frequent if it satisfies the:
A. Support criteria
B. Confidence criteria
C. Both support and confidence criteria
D. None of the above
4. Which of the following properties is used by the Apriori algorithm:
A. Positive definiteness property of support
B. Positive semidefiniteness property of support
C. Monotone property of support
D. Antimonotone property of support
5. Consider three itemsets I1 = {bat, ball, wicket}, I2 = {bat, ball}, I3 = {bat}. Which of the
following statements are correct?
A. support(I1) > support(I2)
B. support(I2) > support(I3)
C. Both statements A and B
D. None of the statements A and B
For questions 6-10, consider the following small database of four transactions. The minimum support is
60% and the minimum confidence is 80%.
Trans id | Itemlist
1 | {F, A, D, B}
2 | {D, A, C, E, B}
3 | {C, A, B, E}
4 | {B, A, D}
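Since the database has only four transactions, the support criterion can be checked by brute-force enumeration; the sketch below (which assumes the transactions as listed above) prints the frequent 1-, 2- and 3-itemsets at 60% minimum support:

```python
from itertools import combinations

transactions = [
    {'F', 'A', 'D', 'B'},
    {'D', 'A', 'C', 'E', 'B'},
    {'C', 'A', 'B', 'E'},
    {'B', 'A', 'D'},
]
min_support = 0.6
items = sorted(set().union(*transactions))

for k in (1, 2, 3):
    frequent = [set(c) for c in combinations(items, k)
                if sum(set(c) <= t for t in transactions) / len(transactions) >= min_support]
    print(k, frequent)   # 1: {A}, {B}, {D}   2: {A,B}, {A,D}, {B,D}   3: {A,B,D}
```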
6. The 1-itemsets that satisfy the support criteria are:
A. {A}, {B}, {C}, {D}
B. {A}, {B}, {C}
C. {A}, {B}, {D}
D. None of the above
7. The 2-itemsets that satisfy the support criteria are:
A. {B,C}, {B,E}, {C,E}, {A,E}
B. {A,B}, {B,D}, {A,D}
C. {A,E}, {B,C}
D. {B,C}
8. The 3-itemsets that satisfy the support criteria are:
A. {A,B,C}, {A,B,E}, {B,C,D}, {A,C,D}
B. {A,B,E}, {B,C,D}, {A,C,D}
C. {A,B,E}, {B,C,D}
D. {A,B,D}
9. Which of the following is NOT a valid association rule?
A. A -> B
B. B -> A
C. A -> D
D. D -> A
10. Which of the following is NOT a valid association rule?
A. A -> DB
B. D -> AB
C. AD -> B
D. DB -> A

Data Mining: Assignment Week 3: Decision Trees
1. Internal nodes of a decision tree correspond to: (1 Mark)
A. Attributes
B. Classes
C. Data instances
D. None of the above
Ans: A
Explanation: Each internal node of the tree corresponds to an attribute, and each leaf node corresponds to
a class label.
2. In a multiclass classification problem, the Bayes classifier assigns an instance to the class
corresponding to: (1 mark)
A. Highest aposteriori probability
B. Highest apriori probability
C. Lowest aposteriori probability
D. Lowest apriori probability
Ans: A
Explanation: The Bayes classifier is also known as the MAP (Maximum A Posteriori) classifier.
3. Three identical bags contain blue and yellow balls. The first bag contains 3 blue and 2 yellow
balls, the second bag has 4 blue and 5 yellow balls, and the third bag has 2 blue and 4 yellow balls.
A bag is chosen randomly and a ball is chosen from it. If the ball that is drawn out is blue, what will
be the probability that the second bag is chosen? (2 Marks)
A. 15/62
B. 27/62
C. 10/31
D. None of the above
Ans: C
Explanation: Apply Bayes' theorem:
P(II | Ball=Blue) = P(Ball=Blue | II) P(II) / [ P(Ball=Blue | I) P(I) + P(Ball=Blue | II) P(II) + P(Ball=Blue | III) P(III) ]
= (4/9)(1/3) / [ (3/5)(1/3) + (4/9)(1/3) + (2/6)(1/3) ] = 10/31
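The arithmetic can be verified in a few lines of Python, using the priors and bag compositions stated in the question:

```python
priors = [1/3, 1/3, 1/3]          # each bag is equally likely to be chosen
p_blue = [3/5, 4/9, 2/6]          # P(Blue | bag I), P(Blue | bag II), P(Blue | bag III)

evidence = sum(p * q for p, q in zip(priors, p_blue))   # P(Blue)
posterior = priors[1] * p_blue[1] / evidence            # P(bag II | Blue)
print(posterior)                                        # 0.3225... = 10/31
```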
For questions 4-9, consider the following table depicting whether a customer will buy a laptop or not.
No. | Age   | Income | Student | Credit_rating | Buys_Computer
1   | <=30  | High   | No      | Fair          | No
2   | <=30  | High   | No      | Excellent     | No
3   | 31-40 | High   | No      | Fair          | Yes
4   | >40   | Medium | No      | Fair          | Yes
5   | >40   | Low    | Yes     | Fair          | Yes
6   | >40   | Low    | Yes     | Excellent     | No
7   | 31-40 | Low    | Yes     | Excellent     | Yes
8   | <=30  | Medium | No      | Fair          | No
9   | <=30  | Low    | Yes     | Fair          | Yes
10  | >40   | Medium | Yes     | Fair          | Yes
11  | <=30  | Medium | Yes     | Excellent     | Yes
12  | 31-40 | Medium | No      | Excellent     | Yes
13  | 31-40 | High   | Yes     | Fair          | Yes
14  | >40   | Medium | No      | Excellent     | No
4. What is the entropy of the dataset? (1 Mark)
A. 0.50
B. 0.94
C. 1
D. 0
Ans: B
Explanation: Entropy(9,5) = -(5/14)log2(5/14) - (9/14)log2(9/14) = 0.94
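The value can be reproduced with a small helper (a minimal sketch; the function name is ours):

```python
from math import log2

def entropy(counts):
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c > 0)

print(round(entropy([9, 5]), 2))   # 0.94 for the 9 Yes / 5 No split
```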
5. Which attribute would information gain choose as the root of the tree? (1 Mark)
A. Age
B. Income
C. Student
D. Credit_rating
Ans: A
Explanation: By the information gain criterion, Age has the highest information gain.
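The gain for Age can be checked directly from the class counts in each Age group, read off the table above (a minimal sketch):

```python
from math import log2

def entropy(counts):
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c > 0)

# class counts (Yes, No) per Age group from the table: <=30, 31-40, >40
age_partitions = [(2, 3), (4, 0), (3, 2)]
parent = (9, 5)
remainder = sum(sum(p) / sum(parent) * entropy(p) for p in age_partitions)
print(round(entropy(parent) - remainder, 3))   # information gain for Age ~ 0.247
```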
6. Whether a person will buy if {Age=35, Student=No, Income=Low, Credit_rating=Fair}? (1 Mark)
A. Yes
B. No
C. The example can't be classified
D. Both classes are equally likely
Ans: A
Explanation: Build the tree and obtain the classification.
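One way to reproduce such classifications (an illustrative sketch, not the method prescribed by the assignment) is to fit scikit-learn's DecisionTreeClassifier with the entropy criterion on the one-hot-encoded table; Age=35 falls in the 31-40 bucket, which is a pure Yes branch, so CART and ID3 agree here:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# The training table from the assignment
rows = [
    ("<=30", "High", "No", "Fair", "No"), ("<=30", "High", "No", "Excellent", "No"),
    ("31-40", "High", "No", "Fair", "Yes"), (">40", "Medium", "No", "Fair", "Yes"),
    (">40", "Low", "Yes", "Fair", "Yes"), (">40", "Low", "Yes", "Excellent", "No"),
    ("31-40", "Low", "Yes", "Excellent", "Yes"), ("<=30", "Medium", "No", "Fair", "No"),
    ("<=30", "Low", "Yes", "Fair", "Yes"), (">40", "Medium", "Yes", "Fair", "Yes"),
    ("<=30", "Medium", "Yes", "Excellent", "Yes"), ("31-40", "Medium", "No", "Excellent", "Yes"),
    ("31-40", "High", "Yes", "Fair", "Yes"), (">40", "Medium", "No", "Excellent", "No"),
]
df = pd.DataFrame(rows, columns=["Age", "Income", "Student", "Credit_rating", "Buys_Computer"])

X = pd.get_dummies(df.drop(columns="Buys_Computer")).astype(int)
y = df["Buys_Computer"]
clf = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

# Q6 query: Age=35 -> "31-40" bucket, Income=Low, Student=No, Credit_rating=Fair
query = pd.DataFrame([{"Age": "31-40", "Income": "Low", "Student": "No", "Credit_rating": "Fair"}])
query = pd.get_dummies(query).reindex(columns=X.columns, fill_value=0).astype(int)
print(clf.predict(query))   # ['Yes']
```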
7. What class does the person {Age=42, Student=No, Income=Medium, Credit_rating=Excellent} belong to? (1 Mark)
A. Yes
B. No
C. The example can not be classified
D. Both classes are equally likely
Ans: B
Explanation: Build the tree and obtain the classification.
8. What class does the person {Age=25, Student=No, Income=Medium, Credit_rating=Fair} belong to? (1 Mark)
A. Yes
B. No
C. The example can not be classified
D. Both classes are equally likely
Ans: B
Explanation: Build the tree and obtain the classification.
9. Which attribute would information gain choose for Age=31-40? (1 Mark)
A. Student
B. Income
C. It is a leaf node with value Yes
D. Credit_rating
Ans: C
Explanation: Construct the tree.
Week 04: Assignment 04

Assignment Week 5: Support Vector Machine
1. Support vector machine is:
A. Maximum apriori classifier
B. Maximum margin classifier
C. Minimum apriori classifier
D. Minimum margin classifier
Answer: B
2. Support vectors in SVM are:
A. Outliers
B. Subset of testing data points
C. Subset of training data points
D. Random points in the data set
Answer: C
3. In a hard margin support vector machine:
A. No training instances lie inside the margin
B. All the training instances lie inside the margin
C. Only few training instances lie inside the margin
D. None of the above
Answer: A

4. The Lagrange multipliers corresponding to the support vectors have a value:
A. equal to zero
B. less than zero
C. greater than zero
D. can take on any value
Answer: C
5. The primal optimization problem solved to obtain the hard margin optimal
separating hyperplane is:
A. Minimize (1/2) WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 for all i
B. Maximize (1/2) WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 for all i
C. Minimize (1/2) WᵀW, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i
D. Maximize (1/2) WᵀW, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i
Answer: A
6. The dual optimization problem solved to obtain the hard margin optimal separating
hyperplane is:
A. Maximize (1/2) WᵀW, such that yᵢ(WᵀXᵢ + b) ≥ 1 - aᵢ for all i
B. Minimize (1/2) WᵀW - Σᵢ aᵢ [yᵢ(WᵀXᵢ + b) - 1], such that aᵢ ≥ 0 for all i
C. Minimize (1/2) WᵀW - Σᵢ aᵢ, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i
D. Maximize (1/2) WᵀW + Σᵢ aᵢ, such that yᵢ(WᵀXᵢ + b) ≤ 1 for all i
Answer: B
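As an aside (not part of the assignment), scikit-learn's linear SVC with a very large C approximates the hard-margin solution, and its dual coefficients store aᵢyᵢ for the support vectors, so W can be recovered exactly as in Q7 below; the toy data here is arbitrary:

```python
import numpy as np
from sklearn.svm import SVC

# Arbitrary linearly separable toy data; a very large C approximates the hard margin
X = np.array([[1.0, 1.0], [2.0, 2.5], [3.0, 1.0],
              [-1.0, -1.0], [-2.0, -2.5], [-3.0, -1.0]])
y = np.array([1, 1, 1, -1, -1, -1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)

# dual_coef_ holds a_i * y_i for the support vectors (all other multipliers are zero)
w_from_dual = clf.dual_coef_ @ clf.support_vectors_
print(np.allclose(w_from_dual, clf.coef_))   # True: W = sum_i a_i * y_i * X_i
```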
7. We are designing an SVM, WᵀX + b = 0. Suppose the Xᵢ are the support vectors and the aᵢ
the corresponding Lagrange multipliers; then which of the following statements are
correct:
A. W = Σᵢ aᵢyᵢXᵢ
B. Σᵢ aᵢyᵢ = 0
C. Either A or B
D. Both A and B
Answer: D

8. If the hyperplane WᵀX + b = 0 correctly classifies all the training points (Xᵢ, yᵢ), where
yᵢ ∈ {+1, -1}, then:
A. ||W|| = 2
B. X=
C. WᵀXᵢ + b ≥ 0 for all i
D. yᵢ(WᵀXᵢ + b) > 0 for all i
Answer: D
9. The dual optimization problem in SVM design is usually solved using
A. Genetic programming
B. Neural programming
C. Dynamic programming
D. Quadratic programming
Answer: D
10. Slack variables are used in which of the below:
A. Soft margin SVM
B. Hard margin SVM
C. Both in Soft margin SVM and Hard margin SVM
D. Neither in Soft margin SVM nor in Hard margin SVM
Answer: A

Data Mining: Assignment Week 6: ANN
1. The sufficient number of output nodes required in an ANN used for a two-class
classification problem is:
A. Random number
B. Same as number of input nodes
C. 1
D. 2
Answer: C
2. How are the weights and biases initialized in an ANN in general?
A. Can be initialized randomly
B. Always initialized to zero
C. Always initialized to infinity
D. Always initialized as 1
Answer: A
3. In which of the below neural networks may the links connect nodes
within the same layer or nodes from one layer to previous layers?
A. Perceptron
B. Feed-forward neural network
C. Recurrent neural network
D. Both B, C
Answer: C
4. Neural Networks are complex ______ with many parameters.
A. Linear Functions
B. Nonlinear Functions
C. Discrete Functions
D. Exponential Functions
Ans: A
5. Artificial neural networks are used for:
A. Pattern Recognition
B. Classification
C. Clustering
D. All of the above
Ans: D
6. A neuron with 3 inputs has the weight vector [0.2, -0.1, 0.1]ᵀ and a bias b = 0. If
the input vector is X = [0.2, 0.4, 0.2]ᵀ, then the total input to the neuron is:
A. 0.2
B. 0.02
C. 0.4
D. 0.10
Ans: B
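The total input is just the weighted sum W·X + b, which a couple of lines confirm:

```python
import numpy as np

w = np.array([0.2, -0.1, 0.1])   # weight vector
x = np.array([0.2, 0.4, 0.2])    # input vector
b = 0.0                          # bias

print(w @ x + b)                 # 0.04 - 0.04 + 0.02 = 0.02
```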
7. A neural network given below takes two binary inputs X1, X2 ∈ {0,1}, and the
activation function for each neuron is the binary threshold function (g(a) = 1 if a > 0; 0
otherwise). Which of the following logical functions does it compute?
[Network diagram]
A. AND
B. NAND
C. XOR
D. NOR
Ans: D

8. The neural network given below takes two binary-valued inputs x1, x2 ∈ {0,1}; the
activation function for each neuron is the binary threshold function (g(a) = 1 if a > 0; 0
otherwise). Which of the following logical functions does it compute?
[Network diagram]
A. AND
B. NAND
C. XOR
D. OR
Ans: C
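Because the figure is not legible here, the weights below are illustrative rather than taken from the diagram; they show one standard way two layers of binary threshold units realize XOR:

```python
def g(a):
    # binary threshold activation: 1 if a > 0, else 0
    return 1 if a > 0 else 0

def xor_net(x1, x2):
    # hidden layer with illustrative weights (not taken from the figure)
    h1 = g(1 * x1 + 1 * x2 + 0.0)       # behaves like OR
    h2 = g(-1 * x1 - 1 * x2 + 1.5)      # behaves like NAND
    return g(1 * h1 + 1 * h2 - 1.5)     # AND of h1, h2 -> XOR overall

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))      # prints 0, 1, 1, 0 for the four input pairs
```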
9. The neural network given below takes two binary-valued inputs x1, x2 ∈ {0,1} and the
activation function is the binary threshold function (h(z) = 1 if z > 0; 0 otherwise). Which
of the following logical functions does it compute?
[Network diagram]
A. OR
C. NAND
D. NOR
Ans: B

10. Under which of the following situations would you expect overfitting to happen?
A. With training iterations, error on the training set as well as the test set decreases
B. With training iterations, error on the training set decreases but error on the test set
increases
C. With training iterations, error on the training set as well as the test set increases
D. With training iterations, training set as well as test set error remains constant
Ans: B