Advanced Database Management System Rev 2019
Paper / Subject Code: 70623 / Advanced Database Management System
1T00161 - M.C.A. (SEM-I) (Choice Based) (Two Year Course) (R 2020-21) / 70623 - Advanced Database Management System
QP CODE: 10033595    DATE: 21/07/2023
(3 Hours) Total Marks: 80
N.B.: 1) Question No. 1 is compulsory.
2) Attempt any THREE from the remaining questions.
3) Figures to the right indicate full marks.
Q1. Write a short note on:
(a) Star and Snowflake schema. [5]
(b) Data preprocessing [5]
(c) Differentiate between ORDBMS and OODBMS [5]
(d) Text Retrieval Methods [5]
Q2. (b) Explain the KDD process in detail. [10]
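Illustrative sketch (not part of the question): a minimal Python chaining of the classic KDD stages, i.e. selection, preprocessing, transformation, data mining, and interpretation/evaluation. The records, function names, and the mined pattern are hypothetical, used only to show the flow of the stages.

raw = [  # hypothetical source records
    {"age": 25, "income": 30000, "bought": "yes"},
    {"age": None, "income": 52000, "bought": "no"},
    {"age": 47, "income": 80000, "bought": "yes"},
]

def select(records):        # 1. selection: keep only the relevant attributes
    return [{"age": r["age"], "bought": r["bought"]} for r in records]

def preprocess(records):    # 2. preprocessing: drop noisy/incomplete rows
    return [r for r in records if r["age"] is not None]

def transform(records):     # 3. transformation: recode into a form fit for mining
    return [(r["age"] >= 40, r["bought"] == "yes") for r in records]

def mine(pairs):            # 4. data mining: here, a trivial co-occurrence measure
    return sum(1 for older, bought in pairs if older and bought) / len(pairs)

def evaluate(support):      # 5. interpretation / evaluation of the discovered pattern
    return f"support of (age >= 40 AND buys) = {support:.2f}"

print(evaluate(mine(transform(preprocess(select(raw))))))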
Q3. (a) Differentiate the following: [10]
1. Agglomerative and Divisive clustering
(b) Apply the Apriori algorithm to the following data set to find the strong association rules with Support = 50% and Confidence = 60%. [10]
Transaction ID    Items
200               Dairymilk, Chips, cold drink
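Illustrative sketch (not part of the question): a minimal Python computation of support and confidence at the thresholds the question states. Only transaction 200 is legible in the paper, so the other transactions below are hypothetical placeholders, and the frequent-itemset step is a brute-force enumeration rather than Apriori's level-wise candidate pruning.

from itertools import combinations

transactions = {  # hypothetical, except row 200 which is given in the paper
    100: {"Dairymilk", "Chips"},
    200: {"Dairymilk", "Chips", "cold drink"},
    300: {"Dairymilk", "cold drink"},
    400: {"Chips"},
}

MIN_SUPPORT = 0.50      # threshold given in the question
MIN_CONFIDENCE = 0.60   # threshold given in the question

def support(itemset):
    # fraction of transactions containing every item of `itemset`
    hits = sum(1 for items in transactions.values() if itemset <= items)
    return hits / len(transactions)

all_items = set().union(*transactions.values())
frequent = [frozenset(c)
            for r in range(1, len(all_items) + 1)
            for c in combinations(sorted(all_items), r)
            if support(frozenset(c)) >= MIN_SUPPORT]

# generate rules A -> B from each frequent itemset and keep the strong ones
for itemset in frequent:
    for r in range(1, len(itemset)):
        for antecedent in map(frozenset, combinations(sorted(itemset), r)):
            consequent = itemset - antecedent
            conf = support(itemset) / support(antecedent)
            if conf >= MIN_CONFIDENCE:
                print(set(antecedent), "->", set(consequent),
                      f"support={support(itemset):.2f} confidence={conf:.2f}")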
1. Web mining
2. Bayesian classification
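Illustrative sketch (not part of the question): a minimal Python version of naive Bayesian classification, which picks the class c maximising P(c) multiplied by the product of P(x_i | c). The training records are hypothetical illustration data.

from collections import Counter

train = [  # (features, class) - hypothetical records
    ({"contains_offer": "yes", "from_contact": "no"},  "spam"),
    ({"contains_offer": "yes", "from_contact": "no"},  "spam"),
    ({"contains_offer": "no",  "from_contact": "yes"}, "ham"),
    ({"contains_offer": "no",  "from_contact": "no"},  "ham"),
    ({"contains_offer": "yes", "from_contact": "yes"}, "ham"),
]

def classify(x):
    class_counts = Counter(c for _, c in train)
    best_class, best_score = None, -1.0
    for c, n_c in class_counts.items():
        score = n_c / len(train)                    # prior P(c)
        for attr, value in x.items():               # likelihoods P(x_i | c)
            n_match = sum(1 for f, cls in train if cls == c and f[attr] == value)
            score *= (n_match + 1) / (n_c + 2)      # Laplace smoothing
        if score > best_score:
            best_class, best_score = c, score
    return best_class

print(classify({"contains_offer": "yes", "from_contact": "no"}))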
(b) Explain the decision tree used in classification. Explain the ID3 algorithm with a suitable example. [10]
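Illustrative sketch (not part of the question): a minimal Python version of the two quantities ID3 uses to choose split attributes, the entropy of a labelled sample and the information gain of an attribute. The training records are hypothetical.

from collections import Counter
from math import log2

def entropy(labels):
    # H(S) = -sum p_i * log2(p_i) over the class proportions
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def information_gain(records, attribute, target="class"):
    # Gain(S, A) = H(S) - sum_v (|S_v| / |S|) * H(S_v)
    labels = [r[target] for r in records]
    remainder = 0.0
    for value in {r[attribute] for r in records}:
        subset = [r[target] for r in records if r[attribute] == value]
        remainder += len(subset) / len(records) * entropy(subset)
    return entropy(labels) - remainder

records = [  # hypothetical training records
    {"outlook": "sunny",    "windy": "false", "class": "no"},
    {"outlook": "sunny",    "windy": "true",  "class": "no"},
    {"outlook": "overcast", "windy": "false", "class": "yes"},
    {"outlook": "rain",     "windy": "false", "class": "yes"},
    {"outlook": "rain",     "windy": "true",  "class": "no"},
]

# ID3 greedily splits on the attribute with the highest gain at each node
best = max(["outlook", "windy"], key=lambda a: information_gain(records, a))
print("Root split on:", best)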
Q6. (a) What is an Abstract Data Type? Explain with a suitable example. [10]
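Illustrative sketch (not part of the question): a minimal Python class showing the abstract-data-type idea, a type characterised by the operations it exposes rather than by its stored representation. In an object-relational database the same idea appears as a user-defined type with methods; the Address type and its fields here are purely hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Address:
    # the ADT bundles data with the operations defined on it;
    # client code works only through those operations
    street: str
    city: str
    pincode: str

    def in_city(self, city: str) -> bool:   # operation on the type
        return self.city.lower() == city.lower()

    def label(self) -> str:                  # operation on the type
        return f"{self.street}, {self.city} - {self.pincode}"

home = Address("MG Road", "Mumbai", "400001")
print(home.label(), home.in_city("mumbai"))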
*************************