
National Data Index

Version 1.0

Document Control

Version | Revision Date | Contributor | Modification
1.0     | Oct 2023      | SDAIA       | First Issue

Table of Contents
1. Document Objective
2. Table of Abbreviations
3. Introduction
4. NDI Overview
5. Scope
6. The NDI Framework
   6.1. Benchmarking
   6.2. Index Design
   6.3. Outcomes
7. Summary
8. Appendix
   8.1. Appendix I – DM Maturity Assessment Questionnaire
   8.2. Appendix II – Acceptance Evidence Checklists
   8.3. Appendix III – Operational Excellence (OE)

1. Document Objective
This document presents introductory details about the National Data Index (NDI) for its first round in 2023. It covers the NDI’s overview, scope, aims, framework, and components, along with their measurement scores and assessment levels, which are described in the appendix.

2. Table of Abbreviations

Abbreviation Definition

NDL National Data Lake (A National Data Platform).

DM Data Management.

DMP Data Marketplace (A National Data Platform).

DG Data Governance.

DQ Data Quality.

GSB Government Service Bus (A National Data Platform).

KSA Kingdom of Saudi Arabia.

NDC National Data Catalog (A National Data Platform).

NDI National Data Index.

NDMO National Data Management Office.

ODP Open Data Portal (A National Data Platform).

OE Operational Excellence.

PDP Personal Data Protection.

RDP Reference Data Management Platform (A National Data Platform).

SDAIA Saudi Data and Artificial Intelligence Authority.

3. Introduction
The Saudi Data and Artificial Intelligence Authority (SDAIA) is on a mission to develop a data-driven economy and foster the Kingdom’s data literacy so that data can be leveraged for operational efficiency and insightful decision-making. To that end, SDAIA launched, as part of its initiatives, the National Data Index (NDI): a dynamic, result-oriented monitoring and evaluation index established to assess the productivity and track the progress of government entities with regard to the Maturity of Data Management (DM) practices, Compliance with the Data Management and Personal Data Protection (DM and PDP) Standards, and Operational Excellence (OE).
The NDI is built specifically to satisfy the data ecosystem requirements of the Kingdom of Saudi Arabia (KSA). This comprehensive framework serves as a reliable metric and an influential indicator, aiding the realization of Vision 2030’s objectives by contributing to becoming a smart nation and building a thriving economy, a vibrant society, and an ambitious nation. The NDI provides government entities with enablers that help them measure current DM practices effectively and achieve high assessment levels, and it aims to achieve the following key objectives:

• Establish a strong Data Governance (DG) framework and policies to govern Data Management (DM) practices, gauge DM Maturity, and ensure Compliance.

• Enhance Data Quality (DQ) and integrity to ensure accurate, complete, and consistent data.

• Accelerate the efficiency and improve the effectiveness of the Data Management (DM) operational processes.

• Implement data lifecycle management processes to handle data from creation to disposal in a compliant manner.

• Develop mechanisms for auditing to track and monitor adherence to compliance reporting and regulations.

• Foster a culture of Data Management (DM) through employee training programs and awareness campaigns for the target audiences.

4. NDI Overview
SDAIA designed the NDI as a data-specific maturity index based on the
benchmarking research done on global data indices to reach best practices.
For activating the NDI, it is developed based on the following three
components:

• DM Maturity Assessment Questionnaire: Measures the extent to which the entity applies best practices in 14 DM domains with regard to individuals, technologies, and operational processes. The output reports also provide recommendations based on the results to improve the entity’s DM Maturity level.

• DM and PDP Standards Compliance Assessment: Measures the entity’s commitment to the adoption and implementation of the Data Management and Personal Data Protection (DM and PDP) Standards published by the National Data Management Office (NDMO).

• Operational Excellence (OE) Assessment: Measures the entity’s progress in taking advantage of the National Data Platforms by assessing the entity’s automated processes and operations across 6 DM Domains so far.

The above NDI components will be measured and audited periodically using
the NDI platform, which is the technological infrastructure of the
assessments. The key NDI platform functionalities include:

• Assessment Execution: The NDI platform facilitates the automated execution of the measurements, ensuring standardized and efficient evaluation processes.

• Data Analysis: The NDI platform provides detailed reports on the results of the NDI components in an interactive and smooth manner.

The NDI platform is a tool designed to facilitate the NDI assessment for the
entity. Furthermore, it covers a range of functionalities that provide real-time
capabilities, enabling the entity to closely track its incremental progress

towards attaining a high NDI score.

5. Scope
The NDI’s scope of work comprises practical auditing procedures to ensure the accuracy, integrity, and conformity of the targeted entities’ DM practices. In its first round, the NDI will cover the following components and DM Domains:

For the Maturity and Compliance components, the NDI scope covers the
following 14 DM domains:

• Data Governance (DG)
• Data Catalog & Metadata Management (MCM)
• Data Quality (DQ)
• Data Operations (DO)
• Document & Content Management (DCM)
• Data Architecture & Modelling (DAM)
• Data Sharing & Interoperability (DSI)
• Reference & Master Data Management (RMD)
• Business Intelligence & Analytics (BIA)
• Data Value Realization (DVR)
• Open Data (OD)
• Freedom of Information (FOI)
• Data Classification (DC)
• Personal Data Protection (PDP)
For the OE component, the NDI scope covers the following 6 prioritized DM
Domains, as others shall be added gradually based on readiness:

1. Data Catalog & Metadata Management (MCM)
2. Data Quality (DQ)
3. Data Operations (DO)
4. Data Sharing & Interoperability (DSI)
5. Reference & Master Data Management (RMD)
6. Open Data (OD)

This comprehensive approach enhances the empowerment of government
entities, mainly in relation to data-based initiatives, by supporting a culture
of continuous improvement, risk mitigation, and effective Data Governance.
The NDI framework helps provide a correct understanding of the strengths and of the gaps to be filled in order to improve DM practices and maximize the value of data assets.

6. The NDI Framework

The NDI Framework includes the index components, measurement methodologies, reporting, audits, and governance mechanisms, as shown in the figure below:

[Figure: The National Data Index (NDI) Framework. The framework flows from (1) Benchmarking (global data indexes and regulatory standards; legislations and themes; shortlisted data indexes) to (2) Index Design (the index components: DM Maturity Assessment Questionnaire, DM and PDP Standards Compliance Assessment, and Operational Excellence (OE) Assessment; the measurement methodology; and the reporting design) to (3) Outcomes (the Maturity Score with its levels, band, and status, and the Compliance Score).]

The NDI Framework is based on a systematic approach and methodology consisting of the following focal elements:

6.1. Benchmarking
Benchmarking was a strategic process undertaken to take into account the lessons learnt from global indices. Its outputs gave SDAIA valuable insight into the relative ranking, methodologies, and process efficiencies of these international data indices. This comparative analysis served as a crucial starting point for identifying the best practices implemented worldwide in data indices. Benchmarking drew on various sources, such as industry reports, surveys, case studies, best practices, themes, and dimensions, which were used to design the NDI.

6.2. Index Design


The design of the NDI covers the various DM dimensions in alignment with
the “DM and PDP Standards”, thus providing a progress roadmap for each
government entity. As shown in the figure below, the NDI was designed by
performing several activities, mainly: determining its components,
formulating its measurement methodology, and designing its reporting
mechanism:

[Figure: The NDI design activities. (1) Index Components: the definitions of the three NDI components (the DM Maturity Assessment Questionnaire, the DM and PDP Standards Compliance Assessment, and the Operational Excellence (OE) Assessment). (2) Measurement Methodology: each of the three NDI components has a different methodology used to measure each entity’s DM practices. (3) Reporting Design: the detailed scores and analysis of the three NDI components provide insights and recommendations for the participating government entities to facilitate continuous improvement.]

The Index Components

The NDI has three components, which are defined below and described in detail in the appendix. They will be measured through a comprehensive evaluation of each entity's Data Management practices. The NDI components are:

DM Maturity Assessment Questionnaire

• This component measures the entities’ maturity in Data Management (DM).
• It is shared with the entities participating in the NDI measurement, which answer the questions and provide supporting evidence.
• It is managed through a dedicated measurement platform.
• It covers 14 DM domains published by the National Data Management Office (NDMO).

The DM and PDP Standards Compliance Assessment

• This component measures the entities’ Compliance with the 191 Specifications in the DM & PDP Standards published by the NDMO, all of which are mapped in the Maturity Questionnaire.
• The Compliance assessment results are obtained by answering the DM Maturity Questionnaire and submitting the acceptance evidence.
• It covers 14 DM domains published by the NDMO.
• The 14 DM domains have scoring weights assigned by SDAIA.

The Operational Excellence (OE) Assessment

• This component measures the operational progress of the entities’ processes that are based on the National Data Platforms / technological solutions, such as the NDC, GSB, etc.

The Measurement Methodology

A measurement methodology is developed for each NDI component, as each one has a different calculation and a separate result used for measuring its level for each entity:

• The Maturity Assessment: calculated from the Maturity questionnaire to measure the entity’s maturity in the 14 DM domains.

• The Compliance Assessment: the Compliance score is deduced from the entity’s answers to the Maturity questionnaire; the result shows the extent of the entity’s commitment to implementing the 191 specifications of the DM domains published by the National Data Management Office (NDMO), namely the “Data Management and Personal Data Protection (DM and PDP) Standards”.

• The Operational Excellence (OE) Assessment: applicable to 6 of the DM domains. It measures how advanced the entity is in its DM operational processes by maximizing the benefit from the National Data Platforms, such as:

  - National Data Lake (NDL)
  - Data Marketplace (DMP)
  - National Data Catalog (NDC)
  - Government Service Bus (GSB)
  - Reference Data Management Platform (RDP)
  - Open Data Portal (ODP)

Reporting Design

A special mechanism has been designed to generate the NDI reports, which present detailed results for each of the index components, together with thorough details that allow the entity to understand its strengths, advancement opportunities, and areas of continuous improvement in its DM practices.

6.3. Outcomes
Based on the approved design of the NDI above, the NDI outcomes generate three scores for each entity, described below:

1. The Score of the DM Maturity Assessment Questionnaire.
2. The Score of the DM and PDP Standards Compliance Assessment.
3. The Score of the Operational Excellence (OE) Assessment.
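For illustration only, the three per-entity outcomes can be pictured as a simple record. The field names below are illustrative assumptions; the NDI platform's actual data model is not published in this document:

```python
from dataclasses import dataclass


@dataclass
class NDIOutcome:
    """Per-entity NDI outcome scores (field names are illustrative)."""
    maturity_score: float    # DM Maturity Questionnaire score on the 0-5 band
    compliance_score: float  # share of the 191 DM & PDP specifications met, in [0, 1]
    oe_level: float          # Operational Excellence level on the 0-5 scale
```

Each field corresponds to one of the three components above and is produced by a separate measurement methodology.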

6.3.1. The Score of DM Maturity Assessment Questionnaire

The DM Maturity score is derived from the entity’s answers to the questionnaire. The Maturity assessment has 6 levels, ranging from Level 0 “Absence of Capabilities” to Level 5 “Pioneer”. The entity’s overall maturity level for its DM practices is determined from the percentage scale described in the table below:

Level 0: Absence of Capabilities
- Maturity Score Band: 0 – 0.24
- Maturity Percentage Scale: 0% – 4.9%
- Level Description: Lack of fundamental data management capabilities, with no established practices.
- Maturity Question Coverage: No practices.
- Objective: None.

Level 1: Establishing
- Maturity Score Band: 0.25 – 1.24
- Maturity Percentage Scale: 5% – 24.9%
- Level Description: Basic data management practices have been introduced, but they are not standardized.
- Maturity Question Coverage: Identification.
- Objective: Building awareness.

Level 2: Defined
- Maturity Score Band: 1.25 – 2.49
- Maturity Percentage Scale: 25% – 49.9%
- Level Description: Data management practices have been developed and formalized, ensuring consistency and reliability.
- Maturity Question Coverage: Development.
- Objective: Developing capabilities.

Level 3: Activated
- Maturity Score Band: 2.5 – 3.99
- Maturity Percentage Scale: 50% – 79.9%
- Level Description: Centralized governance is in place, with end-to-end Key Performance Indicators (KPIs) and metrics.
- Maturity Question Coverage: Implementation or execution.
- Objective: Implementing and standardizing the developed practices.

Level 4: Managed
- Maturity Score Band: 4 – 4.74
- Maturity Percentage Scale: 80% – 94.9%
- Level Description: Institutional processes, scalable tools, and initial stages of automation have been implemented to support data management practices.
- Maturity Question Coverage: KPIs and metrics.
- Objective: Managing best practice and measuring progress.

Level 5: Pioneer
- Maturity Score Band: 4.75 – 5
- Maturity Percentage Scale: 95% – 100%
- Level Description: Continuous improvement is enabled, with a strong focus on data innovation and a reputation as a benchmark for excellence.
- Maturity Question Coverage: Continuous improvement.
- Objective: Innovation.
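The percentage-to-level mapping above can be sketched as follows. This is an illustration based on the published percentage scale, not the NDI platform’s implementation:

```python
# Maturity percentage scale, as (lower bound %, level, status),
# checked from highest to lowest. Illustrative sketch only.
MATURITY_SCALE = [
    (95.0, 5, "Pioneer"),
    (80.0, 4, "Managed"),
    (50.0, 3, "Activated"),
    (25.0, 2, "Defined"),
    (5.0,  1, "Establishing"),
    (0.0,  0, "Absence of Capabilities"),
]


def maturity_level(percentage: float) -> tuple[int, str]:
    """Map a maturity percentage in [0, 100] to (level, status)."""
    if not 0.0 <= percentage <= 100.0:
        raise ValueError(f"percentage out of range: {percentage}")
    for lower_bound, level, status in MATURITY_SCALE:
        if percentage >= lower_bound:
            return level, status
    return 0, "Absence of Capabilities"  # unreachable; input is validated above
```

For example, an entity scoring 60% would be placed at Level 3 “Activated”.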

6.3.2. The Score of the DM and PDP Standards Compliance Assessment

The Compliance score is also derived from the entity’s answers to the questionnaire, but based on the acceptance evidence submitted. This evidence is mapped to the 191 specifications of the DM and PDP Standards. The NDI defines the following two binary levels of Compliance:


1. Compliant:
• An entity is “Compliant” with a specification if it is fully implemented as stated in the DM and PDP Standards (acceptance evidence to be provided where applicable).

2. Non-Compliant:
• An entity is “Non-Compliant” with a specification if it is not fully implemented.

• Partial implementation of a specification is also considered “Non-Compliant”.
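Combining the binary rule above with the domain scoring weights mentioned earlier, a compliance score could be sketched as follows. This is illustrative only: the actual weights are assigned by SDAIA and are not published in this document, and the aggregation formula shown is an assumption:

```python
def compliance_score(results: dict[str, list[bool]],
                     weights: dict[str, float]) -> float:
    """Weighted share of compliant specifications, in [0, 1].

    results maps a DM domain code (e.g. "DG") to per-specification
    booleans (True = Compliant; partial implementation counts as
    Non-Compliant, i.e. False). weights maps the same domain codes
    to weights that sum to 1. Both are hypothetical inputs.
    """
    score = 0.0
    for domain, specs in results.items():
        if specs:
            # Domain contribution = weight * fraction of compliant specs.
            score += weights[domain] * (sum(specs) / len(specs))
    return score
```

For instance, with hypothetical weights of 0.6 for DG and 0.4 for DQ, a 3-of-4 compliant DG domain and a 1-of-2 compliant DQ domain would yield 0.6 × 0.75 + 0.4 × 0.5 = 0.65.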

6.3.3. The Score of the Operational Excellence (OE) Assessment

The OE Assessment component consists of several metrics divided across 6 of the DM domains listed under the scope (i.e., MCM, DQ, DO, DSI, RMD, and OD; more shall be added subsequently). The metric scores use different measurement units (e.g., time, percentage, etc.). To normalize the measurement units, each metric output is mapped to a predefined scale ranging from Level 0 “Unacceptable” to Level 5 “Leader”, as described in the following table:
The Operational Excellence (OE) levels: Unacceptable (0), Low (1), Fair (2), Good (3), Excellent (4), Leader (5).

Level 0: Unacceptable
- The entity is unable to meet the minimum OE threshold.

Level 1: Low
- The entity barely meets the minimum expected OE threshold. Most of the operations are ad hoc or performed with minimal established Standard Operating Procedures (SOPs).

Level 2: Fair
- The entity has started applying the foundational measures and marginally exceeds the minimum expected OE threshold in Data Management processes and operations.

Level 3: Good
- The entity has a clear understanding of its objectives in the context of Data Management operations. The entity applies advanced tools and techniques to improve DM.

Level 4: Excellent
- The entity has the ability to consistently exceed the expected OE threshold and Key Performance Indicators (KPIs). The entity has established a culture of continuous improvement.

Level 5: Leader
- The entity has a strong focus on innovation and creativity, and acts as a front runner in adopting cutting-edge processes, techniques, and methods.
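The normalization of heterogeneous metric outputs (time, percentage, etc.) onto the common 0–5 OE scale can be sketched as follows. The thresholds in the example are hypothetical; the actual per-metric scales are predefined within the NDI and are not published here:

```python
def oe_level(value: float, thresholds: list[float],
             higher_is_better: bool = True) -> int:
    """Map a raw metric value to an OE level 0..5.

    thresholds holds 5 boundaries ordered from easiest to hardest;
    each boundary crossed earns one level. For metrics where a lower
    value is better (e.g. response time), pass higher_is_better=False
    and list the boundaries from loosest to tightest.
    """
    if not higher_is_better:
        # Negate both sides so "crossing a boundary" is always ">=".
        value = -value
        thresholds = [-t for t in thresholds]
    level = 0
    for bound in thresholds:
        if value >= bound:
            level += 1
    return level
```

For example, a percentage metric with hypothetical boundaries [20, 40, 60, 80, 95] maps a value of 85 to Level 4 “Excellent”, while a turnaround-time metric with boundaries [48, 24, 12, 6, 1] hours maps 5 hours to Level 4.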

7. Summary
In conclusion, SDAIA developed and launched the National Data Index (NDI) with its three complementary component assessments, Maturity, Compliance, and OE, in pursuit of improving DM practices and processes in government entities and maximizing the value of data as a national asset. This contributes to leveraging data in informed decision-making, optimizing operational efficiency, and achieving the national goal of becoming a global leader in data-driven economies. Furthermore, OE advances the entity’s progress in DM processes and operations with the help of the National Data Platforms.

8. Appendix
This section consists of the detailed description of each NDI component
which will have a separate score based on different measurements, thus
contributing to the NDI outcomes:

8.1. Appendix I – DM Maturity Assessment Questionnaire

This section provides a detailed questionnaire, which contains 42 questions aimed at evaluating entities’ Data Management practices across 14 domains. Each domain has specific questions with maturity levels ranging from Level 0 (Absence of Capabilities) to Level 5 (Pioneer).

To demonstrate commitment to Data Management practices, the entities must provide SDAIA with acceptance evidence for each maturity level. This includes a range of evidence, such as policies, reports, process documentation, implementation records, and other relevant artifacts. It will also help ensure compliance with the Data Management and Personal Data Protection (DM and PDP) Standards.

8.1.1. Data Governance Domain

Maturity Questions – Data Governance Domain

DG.MQ.1: Has the Entity established & implemented a Data Management & Personal Data Protection (DM & PDP) Strategy and a DM & PDP Plan with Key Performance Indicators (KPIs) that can be continuously measured to ensure optimization?

(Related Specification codes appear in parentheses after the corresponding acceptance evidence.)

Level 0: Absence of Capabilities
- Level Description: No Data Management & Personal Data Protection (DM & PDP) strategy is in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity has existing DM practices that are not formalized (not approved).
- Acceptance Evidence: Existing data-related practices.

Level 2: Defined
- Level Description: A cross-Entity DM & PDP strategy is defined for all DM Domains. DM Guiding Principles have been developed.
- Acceptance Evidence: The approved DM & PDP Strategy (DG.1.1); the DM Guiding Principles (DG.1.2); the Data Strategy Approval Decision (DG.1.4).

Level 3: Activated
- Level Description: The Entity developed a DM & PDP plan based on the defined DM & PDP strategy. The Entity is implementing this DM & PDP plan, which covers all DM Domains.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the developed DM & PDP implementation plan (DG.1.3); the implementation status report.

Level 4: Managed
- Level Description: The Entity is monitoring the DM & PDP strategy & plan implementation across all DM Domains with the pre-defined KPIs.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring report of the DM & PDP strategy & plan implementation with the pre-defined KPIs.

Level 5: Pioneer
- Level Description: The DM & PDP strategy is periodically updated, improved, and optimized across all DM Domains. The DM & PDP implementation plan is periodically reviewed and updated based on both the changes to the DM & PDP strategy and the changing business requirements.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the DM & PDP Plan (DG.6.1); the Continuous Improvement Report of the DM & PDP Strategy (DG.7.2).

Maturity Questions – Data Governance Domain

DG.MQ.2: Has the Entity established and implemented Data Management (DM) Policies, Standards, and Guidelines across all Data Management (DM) Domains?

Level 0: Absence of Capabilities
- Level Description: No Data Management (DM) Policies, Standards, or Guidelines are in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity has existing DM-related policies and standards that are not formalized. The Entity conducted a gap analysis to identify the required DM and PDP policies to be developed.
- Acceptance Evidence: The Data Management & Personal Data Protection (DM & PDP) Policies, Controls and Guidelines Gap Analysis document (DG.2.1).

Level 2: Defined
- Level Description: The Entity developed policies, standards, and guidelines for all DM Domains.
- Acceptance Evidence: All level 1 acceptance evidence requirements, including: the developed DM and PDP policies, standards, and guidelines covering all DM Domains as required (DG.2.2).

Level 3: Activated
- Level Description: The Entity is implementing the developed DM and PDP policies with defined processes and standards to establish governance and uniformity and manage compliance across the Entity.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the implementation status report; a document proving the Entity's approval & adoption of the developed policies, standards & guidelines; the approved Compliance Management Framework (DG.5.1).

Level 4: Managed
- Level Description: The DM and PDP policies, standards, and processes are tracked and measured based on the pre-defined KPIs. Periodic monitoring is conducted on the Entity's compliance with the Regulations published by NDMO-SDAIA.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring reports for the developed policies, processes, and standards with pre-defined KPIs (DG.7.1 (3 & 6)); the Entity's Compliance Audit Results Report (DG.5.2); the Compliance Monitoring Report (DG.5.3).

Level 5: Pioneer
- Level Description: The DM and PDP policies, standards, DG processes, and compliance with regulations are continuously reviewed and optimized.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the updated DM and PDP policies, standards, and updated DG processes.

Maturity Questions – Data Governance Domain

DG.MQ.3: Has the Entity established and operationalized all roles required for the Data Management Organization as per the NDMO Controls & Specifications?

Level 0: Absence of Capabilities
- Level Description: No formal/approved Data Management Organization structure is in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity established a Data Management Office with one or two roles only.
- Acceptance Evidence: The Entity's Data Management Office establishment decision (DG.4.1); the Hiring / Appointment decisions of the following roles: (a) Chief Data Officer (CDO) (DG.4.3); (b) Data Management Officer / Data Governance Officer (DG.4.4).

Level 2: Defined
- Level Description: The Entity's "Data Management & Data Governance Committee" (an internal executive committee) is established, with a total of six roles (out of the DM Office roles) assigned & operationalized for the Data Management Organization.
- Acceptance Evidence: All level 1 acceptance evidence requirements, including: the Entity Data Management & Data Governance Committee formation decision (DG.4.2); the Hiring / Appointment decisions of the following roles: (a) Compliance Officer (DG.4.6); (b) Business Data Executive (DG.4.8); (c) Legal Advisor (DG.4.11).

Level 3: Activated
- Level Description: All DM roles are established as per the DM and PDP Standards and fully adopted across the Entity. Data Stewardship / Ownership is defined across the Entity, in addition to providing business and technical support for critical Data systems & critical Data elements.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the Hiring / Appointment decisions of the following roles: (a) Business Data Steward(s) (DG.4.9); (b) the list of the IT Data Stewards (DG.4.10); (c) Personal Data Protection (PDP) Officer (DG.4.7); (d) Open Data and Information Access Officer (DG.4.5); the documented & approved Data Management Organization structure; the documented Data Stewardship / Ownership structure.

Level 4: Managed
- Level Description: The Data Management Organization & its structure are fully established and operationalized. The roles are tracked through pre-defined KPIs, periodically updated, and optimized.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring reports for the Entity's Data Management Organization roles with pre-defined KPIs (DG.7.1 (KPI 1)).

Level 5: Pioneer
- Level Description: The Data Management Organization (& its structure) and Data Stewardship (& its structure) (Data Owners, business and technical stewards) are periodically reviewed and updated.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report for the DM Organization and Data Stewardship.

Maturity Questions – Data Governance Domain

DG.MQ.4: Has the Entity established and implemented practices for Change Management including awareness, communication, change control, and capability development?

Level 0: Absence of Capabilities
- Level Description: No formal Change Management practices are in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: Awareness sessions, training courses, and communication on DM-related practices are performed on a reactive and ad-hoc basis. The Entity has change control practices that are not formalized.
- Acceptance Evidence: Evidence of communication on DM-related practices; evidence of training and awareness sessions conducted.

Level 2: Defined
- Level Description: The Entity has defined Change Management practices. There is an Entity-wide awareness, training, and communication plan. Change control practices are established to ensure coordinated efforts in managing changes on Data systems (implemented within the Entity).
- Acceptance Evidence: All level 1 acceptance evidence requirements, including the Change Management Plan covering all DM Domains, which includes: (a) the DM & PDP Training Plan for all DM Domains; (b) the DM Communication Plan; (c) the Stakeholders Engagement Plan; (d) the Change Control Plan for Data system changes.

Level 3: Activated
- Level Description: The Entity is implementing the defined Change Management practices for all DM Domains. Formal communication channels are established and adopted for DM issues management, escalations, resolutions & approvals. A version control mechanism is defined and implemented for DM documents and artifacts that the Entity published. The DM & PDP Strategy is being socialized and awareness is conducted across the Entity.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the Change Management Implementation Status Report showing the DM & PDP Training activities (DG.3.1); the Change Management Implementation Status Report showing the DM Communication activities (DG.6.2); the Stakeholders Engagement and Socialization Plan implementation status report (DG.1.4); the Data Governance Approvals Register (DG.8.1); the Data Management Issue Tracking Register (DG.8.2); evidence that the Entity has DM & DG document & artifact version control practices (DG.8.3).

Level 4: Managed
- Level Description: The Entity is monitoring the progress and the effectiveness of the Change Management practices with pre-defined KPIs.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring report of the Change Management practices with pre-defined KPIs (DG.7.1 (KPIs 2, 4, 5, 7 & 8)).

Level 5: Pioneer
- Level Description: Change Management practices for all DM Domains are periodically reviewed and continuously updated, optimized & developed.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the Change Management Practices for all DM Domains.

28
8.1.2. Metadata and Data Catalog Domain

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.1 Has the Entity developed and implemented a plan to integrate and manage Metadata across the Entity?

Related
Level Name Level Description Acceptance Evidence
Specification

Level 0: - No plan is in place to integrate and manage Metadata in - Not Applicable.


Absence of the Entity.
Capabilities

Level 1: - The Entity manages its Metadata on a reactive and ad hoc - The Recorded or Documented
basis.
Establishing Metadata Report.

Level 2: Defined
- The Entity developed a plan, an operating model and a structure for the implementation of Metadata and Data Catalog.
- The Metadata management structure should include the requirements to capture, integrate, populate, publish and utilize the required technology solution for the Entity's Metadata (business and technical).
- A roadmap is defined with a list of activities, required resources, and a budget to manage the Data Catalog and Metadata implementation.

Acceptance Evidence:
- The Approved Metadata and Data Catalog Plan. (MCM.1.1)
- The Approved Metadata Structure and Framework. (MCM.4.3)

Level 3: Activated
- The Entity is implementing the defined Metadata and Data Catalog plan and framework.
- This implementation includes the identification and prioritization of the critical systems and attributes.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- The Metadata and Data Catalog Plan Implementation Status Report.
- The Metadata Management Framework Implementation Status Report.


Level 4: Managed
- The Entity is monitoring the effectiveness of the Metadata and Data Catalog plan and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report with pre-defined Key Performance Indicators (KPIs) for the Metadata and Data Catalog plan and activities.

Level 5: Pioneer
- The Metadata and Data Catalog plan and activities are regularly monitored for continuous improvement and optimization.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Metadata and Data Catalog Plan.

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.2 Has the Entity implemented a Metadata Management and Data Catalog tool / solution?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No tools are in place to manage the Entity's Metadata.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity manages its Metadata manually or in standalone applications without standardized tools.

Acceptance Evidence:
- Evidence of Existing Metadata.

Level 2: Defined
- The Entity documented its requirements for the Data Catalog tool / solution and selected an appropriate tool based on business and regulatory requirements.
- The Entity's data sources to be included in the Data Catalog tool have been identified and prioritized.

Acceptance Evidence:
- The Selected Data Catalog Tool.
- The Approved and Prioritized Data Sources Report. (MCM.1.2)

- The Entity developed a target Metadata architecture.

Acceptance Evidence:
- The Developed and Approved Target Metadata Architecture. (MCM.1.3)
- The Data Catalog Tool Implementation Requirements Report.
- The Approved and Developed Data Catalog Training Plan.

Level 3: Activated
- The Data Catalog tool has been implemented and populated with the Entity's Metadata.
- The tool is integrated into the Entity's Metadata repository with appropriate access permissions.
- The tool stores the activities and tracking logs for audit trails.
- Power users are identified and trained on the Data Catalog tool usage.
- A training and communication plan is rolled out to raise awareness and adoption of the Data Catalog tool.
- The Data Catalog automated tool is regularly updated to the latest version.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- Evidence of the Implemented Data Catalog Tool. (MCM.5.1)
- Data Access Approval Process Documentation (Authorization to Connect the Data Catalog with the Data Sources). (MCM.2.1)
- Metadata Access Approval Process Documentation. (MCM.2.2)
- Evidence of Data Catalog Adoption and Usage, Including Metadata Populated on the Tool. (MCM.3.2)
- The Regular Audits Report on the Data Catalog Usage. (MCM.5.3)
- Evidence of Training Conducted for the Identified Data Catalog Users. (MCM.3.1)
- Tool Versioning Report. (MCM.5.4)


Level 4: Managed
- The Entity is monitoring changes to its Metadata through automated notifications.
- The adoption and usage of the implemented Data Catalog tool are closely monitored based on a set of pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report with pre-defined KPIs for the Adoption and Usage of the Metadata & Data Catalog Solution / Tool. (MCM.6.1)
- The Approved List of pre-defined KPIs for the Quality of the Metadata Populated on the Data Catalog Tool.

Level 5: Pioneer
- The Entity has fully automated end-to-end Metadata (Business, Technical, and Operational) capture and exchange.
- The Entity continuously monitors and optimizes the Data Catalog.
- The Entity continuously updates the Data Catalog in the event of any changes specified by the National Data Bank (NDB).

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Metadata and Data Catalog Tool's quality.
- The Metadata Management Automation Report.
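Automated technical-metadata capture of the kind expected at the higher maturity levels is often implemented by harvesting schema information directly from each prioritized data source. The sketch below is a minimal, hypothetical illustration and not part of the NDI requirements: it reads table and column metadata from a SQLite database into a simple in-memory catalog record. A production Data Catalog tool would do the equivalent against every connected source and persist the result; all names here are invented.

```python
import sqlite3

def harvest_technical_metadata(conn: sqlite3.Connection) -> dict:
    """Collect table/column metadata from a SQLite source (illustrative only)."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [
            {"name": c[1], "type": c[2],
             "nullable": not c[3], "primary_key": bool(c[5])}
            for c in cols
        ]
    return catalog

# Hypothetical source system with one table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE citizen (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
metadata = harvest_technical_metadata(conn)
```

The same pattern applies to other engines through their catalog views (e.g., `information_schema` in most relational databases).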

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.3  Has the Entity defined and implemented formal processes for effective Metadata Management, such as prioritization, population, access management, and quality issue management, supported and fostered by collaboration across the Entity?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No process is in place to manage Metadata.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity manages the Metadata on a reactive and ad hoc basis (e.g., projects / initiatives) without formalized practices.

Acceptance Evidence:
- Evidence of the Existing Processes Used to Manage Metadata.


Level 2: Defined
- The Entity defined processes to manage the Metadata across the business and technical applications.
- The Entity developed processes to prioritize, populate and manage the Metadata, including access management, annotation, certification, and Metadata quality, through collaboration across the Entity.

Acceptance Evidence:
- Metadata Identification Process Report.
- Metadata Prioritization Process Report.
- Metadata Population Process Report. (MCM.4.2)
- Metadata Update Process Report. (MCM.4.4)
- Metadata Quality Process Report. (MCM.4.5)
- Metadata Annotation Process Report. (MCM.4.6)
- Metadata Certification Process Report. (MCM.4.7)

Level 3: Activated
- The Entity is implementing defined processes for end-to-end Metadata management.
- The established processes for updating the Metadata are implemented and automated as workflows within the Data Catalog tool.
- The integrated metamodel is deployed with role-based access.
- The Metadata quality issues are identified and addressed.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- Evidence of the Implementation and Adoption of the Approved Processes as Workflows in the Entity's Data Catalog.
- The Logs or the List of Notifications on the Metadata Changes. (MCM.5.2)

- Full end-to-end Metadata is collected and enriched.
- All the Data Governance and Data Management teams, along with business stakeholders, are engaged collaboratively through the automated tool.

Acceptance Evidence:
- Evidence of Communications to the Data Catalog Users of any Metadata Update.
- The Metadata Stewardship Coverage Model. (MCM.4.1)

Level 4: Managed
- The Entity is monitoring and tracking the Metadata processes and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Metadata Processes Monitoring Report with pre-defined KPIs.
- The Metadata Quality Monitoring Report with pre-defined KPIs. (MCM.6.2)

Level 5: Pioneer
- The Entity's Metadata management processes and practices are regularly monitored for continuous improvement and optimization.
- Optimization techniques are being utilized to improve the processes of developing taxonomies, ontologies, or semantic representations.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Metadata Management Practices.

8.1.3. Data Quality Domain

Maturity Questions – Data Quality Domain

DQ.MQ.1 Has the Entity developed and implemented a Data Quality (DQ) plan focused on improving the quality of the Entity's Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No formal Data Quality (DQ) plan is in place.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity implements Data Quality activities on a reactive or ad hoc basis.

Acceptance Evidence:
- Evidence of the Existing Data Quality (DQ) Related Activities.

Level 2: Defined
- The Entity has a defined and approved DQ management plan.
- A roadmap is defined with a list of activities, required resources, and a budget to manage the DQ implementation.

Acceptance Evidence:
- The Defined and Approved DQ Implementation Plan. (DQ.1.2)

Level 3: Activated
- The Entity is implementing the defined DQ plan and roadmap.
- Clear roles and responsibilities are defined for the DQ activities in line with the Data Management Organization structure.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- A Report on the DQ Plan Implementation Status.
- A Report on the Defined DQ Roles & Responsibilities.
- A Report on the Assigned Resources for the DQ Plan.

Level 4: Managed
- The Entity is monitoring the implementation of the DQ plan and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DQ Plan and Activities with pre-defined Key Performance Indicators (KPIs).


Level 5: Pioneer
- The DQ plan and activities are regularly monitored for optimization and continuous improvement.
- DQ Management is embedded in the Data Lifecycle and the Software Development Lifecycle (SDLC).

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the Data Quality Plan & Activities.

Maturity Questions – Data Quality Domain

DQ.MQ.2 Has the Entity established / developed and implemented practices to manage and improve the quality of the Entity's Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No practices are in place for Data Quality (DQ).

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity has existing DQ-related practices that are not formalized.
- The identification of Data issues and the resolution activities are done on a reactive or ad hoc basis.

Acceptance Evidence:
- The Existing DQ Domain Initiatives.
- The Existing Processes for Detecting DQ Issues.
- The Existing Processes Used for Data Corrections or Data Validations.

Level 2: Defined
- The Entity prioritized Data elements based on the business requirements from the perspective of their importance for DQ Management.
- The Entity defined DQ Dimensions for the prioritized Datasets.
- The Entity defined and documented DQ Rules including formats, mandatory fields, and validations for data-entry values.
- The Entity established DQ issue management processes to identify and resolve the DQ issues.

Acceptance Evidence:
- The Prioritized List of Data Elements. (DQ.1.1)
- The Defined DQ Dimensions for the Entity's Datasets. (DQ.2.1)
- The Data Quality Rules Report. (DQ.2.1)
- The Developed & Approved Processes for DQ Issue Management & Remediation. (DQ.2.3)

Level 3: Activated
- The Entity is implementing DQ Management.
- An initial DQ assessment was conducted to identify DQ issues. Periodic DQ assessments are established to ensure the improvement of the Entity's DQ.
- Data issues were identified, a root cause analysis was conducted, and remediation activities were developed accordingly to correct the identified Data issues.
- Standardized DQ tools are implemented with automated workflows and defined DQ Rules.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- The Planned & Conducted Initial & Periodic Data Quality Assessment Report. (DQ.1.3)
- The Resolution Status Report of the Identified DQ Issues. (DQ.2.3)
- Evidence of DQ Tools Used for Automating DQ Issue Management Workflows. (DQ.2.5)
- The Defined & Implemented DQ Service Level Agreements (SLAs). (DQ.2.4)
- The List of DQ System Enforcements Adopted by the Entity.

Level 4: Managed
- The Entity is monitoring the effectiveness of the DQ management practices with pre-defined KPIs.
- The Entity is monitoring the agreed-upon DQ threshold values with pre-defined SLAs.
- The Entity is monitoring the effectiveness of the issue resolution process.

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DQ Management Practices with pre-defined KPIs.
- The Monitoring Report of the DQ Threshold Values.
- The Monitoring Report of the DQ Issue Resolution Process. (DQ.3.2)


Level 5: Pioneer
- Proactive profiling and critical Dataset identification are done using advanced Artificial Intelligence / Machine Learning (AI/ML) based methods.
- The DQ tools are continuously optimized to improve the DQ management processes.
- Proactive remediation is done with proper monitoring of critical Data systems, Data entities (Data objects) and Data attributes.
- The Entity implemented DQ Management industry standards or developed Entity-specific standards.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Tools Used for DQ.
- The Continuous Improvement Report on the DQ Management Practices.
- The Implemented / Adopted DQ Industry Standards.
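DQ Rules of the kind described at Level 2 (formats, mandatory fields, validations for data-entry values) ultimately reduce to executable checks. The following sketch uses hypothetical field names and rules to show one way such rules can be expressed and applied to individual records; real DQ tools would generate equivalent logic from the approved Data Quality Rules Report.

```python
import re

# Hypothetical DQ rules: a mandatory flag and a format validation per attribute.
DQ_RULES = {
    "national_id": {"mandatory": True, "format": re.compile(r"^\d{10}$")},
    "email":       {"mandatory": False, "format": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")},
}

def check_record(record: dict) -> list[str]:
    """Return a list of DQ issues found in a single record."""
    issues = []
    for field, rule in DQ_RULES.items():
        value = record.get(field)
        if value in (None, ""):
            # Empty value: only an issue if the field is mandatory.
            if rule["mandatory"]:
                issues.append(f"{field}: missing mandatory value")
            continue
        if not rule["format"].match(str(value)):
            issues.append(f"{field}: format violation ({value!r})")
    return issues

issues = check_record({"national_id": "12AB", "email": ""})
```

Running the same checks over a whole Dataset yields the raw counts behind completeness and validity DQ Dimensions.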

Maturity Questions – Data Quality Domain

DQ.MQ.3 Has the Entity established and implemented practices to monitor and report the Entity's Data Quality (DQ) status?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No practices are in place to monitor or report on the quality of Data.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity monitors its DQ on a reactive or ad hoc basis.

Acceptance Evidence:
- The Existing DQ Monitoring Practices.
- Evidence of the Entity's Current DQ Status.


Level 2: Defined
- The Entity defined and formalized DQ monitoring and reporting practices to maintain and document the DQ activities on a regular basis.
- The Entity developed DQ processes and a plan for conducting DQ audits at defined checkpoints.

Acceptance Evidence:
- The Defined and Formalized DQ Monitoring Plan. (DQ.2.2)
- The Defined DQ Checkpoints Report.

Level 3: Activated
- The Entity is monitoring the quality of its Data and documenting the quality status on a regular basis.
- DQ reviews are conducted at the defined DQ checkpoints.
- A workflow for reporting DQ issues has been implemented on the Data Catalog automated tool.
- The DQ Rules and the DQ monitoring results are documented as Metadata within the Data Catalog automated tool.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- DQ Scorecards or Dashboards. (DQ.2.2)
- A Report on the Data Quality Metadata Logged on the Data Catalog Tool. (DQ.4.3)
- Evidence of a Data Quality Support Process Implemented as a Workflow. (DQ.4.2)
- The Results of the DQ Checkpoint Reviews. (DQ.4.1)

Level 4: Managed
- The Entity monitors the DQ trends and reporting activities based on pre-defined KPIs and established targets.

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- Trends from the DQ Monitoring Activities with pre-defined KPIs. (DQ.3.1)

Level 5: Pioneer
- The Entity conducts periodic reviews of the monitoring and reporting practices for continuous improvement, in line with changes in business and technical requirements.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- A Continuous Improvement Report on the DQ Monitoring and Reporting Practices.
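A DQ Scorecard at a defined checkpoint typically compares measured dimension scores against agreed targets. As a minimal, hypothetical illustration of that comparison (the dimensions, targets and scores below are invented, not prescribed by the NDI):

```python
# Hypothetical DQ targets per dimension (percent), as set in a DQ Monitoring Plan.
TARGETS = {"completeness": 98.0, "validity": 95.0, "uniqueness": 99.0}

def build_scorecard(measured: dict) -> dict:
    """Flag each measured DQ dimension as PASS or FAIL against its target."""
    return {
        dim: {"score": score,
              "target": TARGETS[dim],
              "status": "PASS" if score >= TARGETS[dim] else "FAIL"}
        for dim, score in measured.items()
    }

scorecard = build_scorecard(
    {"completeness": 99.1, "validity": 91.4, "uniqueness": 99.5}
)
```

Storing one such scorecard per checkpoint gives the time series needed for the trend monitoring described at Level 4.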

Maturity Questions – Data Quality Domain

DQ.MQ.4  Has the Entity developed Data Quality (DQ) standards, provided definitions for its Datasets, and published / uploaded the definitions on the National Data Catalog (NDC)?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No standards or definitions are in place for the Entity's Datasets.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity defines its Datasets and DQ standards on an ad hoc / project basis.

Acceptance Evidence:
- The Existing List of Data Standards and Data Definitions.

Level 2: Defined
- The Entity has Data definitions and standards for a set of domains / attributes in line with the DQ Dimensions.
- The Entity identified and provided Metadata definitions for all Data attributes in line with the DQ definitions.
- The Entity identified the Datasets for which Metadata should be published on the National Data Catalog (NDC).

Acceptance Evidence:
- The Developed Data Standards for Data Elements.
- The Identified Metadata with their Definitions.
- The Identified Datasets to be Published on the National Data Catalog (NDC).

Level 3: Activated
- The Entity has been onboarded on the NDC and populated the Metadata definitions on the NDC.
- The Entity is developing and implementing standards independently for the Entity-specific domains / attributes.
- The Entity uploads the Metadata on the NDC as required.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- A Report about the Data Standards & Data Definitions which the Entity Uploaded on the NDC.
- The Entity-Specific List of Applied Data Definitions and Applied Data Standards.


Level 4: Managed
- The Entity's Data definitions and Data standardization processes are reviewed and tracked through pre-defined KPIs.

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- A Monitoring Report for the Data Definitions and Data Standardization with pre-defined KPIs.

Level 5: Pioneer
- The standards are developed proactively for all new and future systems, and are continuously improved across the Entity's Data systems.
- Regular assessments are conducted to ensure that the standards and definitions are provided and implemented for the Entity's Datasets.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- A Continuous Improvement Report for Optimizing the Data Standards and Definitions within the Entity and on the NDC.

8.1.4. Data Operations Domain

Maturity Questions – Data Operations Domain

DO.MQ.1  Has the Entity developed and implemented a plan to manage and satisfy the needs of Data Operations, Data storage and Data retention?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No plan is in place for Data Operations (DO), Data storage, Data retention and disaster recovery (DR).

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity has existing practices for Data Operations, storage and retention; however, these are set up reactively on an ad hoc basis.

Acceptance Evidence:
- The Approved Initial Data Operations and Storage Plan.

Level 2: Defined
- The Entity has a defined and approved Data Operations (DO) and storage plan and roadmap including:
   Data storage forecasting.
   Budget and resource allocation.
   Information systems prioritization.
- This DO plan is in line with:
   Policy development.
   Tool selection.
   Business continuity.
   Plans for backup and disaster recovery.

Acceptance Evidence:
- The Developed and Approved Data Operations and Storage Plan. (DO.1.1)
- The Information Systems Priority List. (DO.1.3)
- The Developed and Approved Policies for Data Operations, Storage, Retention and Business Continuity. (DO.2.1)
- The Periodic Forecasting Plan for Storage Capacity. (DO.1.2)
- The Database Technology Evaluation Process and Selection Criteria. (DO.1.4)

Level 3: Activated
- The Entity is implementing its defined Data Operations (DO) plan.
- The Entity assigned and activated the roles and responsibilities to manage the DO, storage and retention in line with the business requirements and the overall "Data Management Organization" structure.
- The Entity is reporting on the Database performance.
- The Entity is conducting periodic forecasts of its storage capacity.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- The Data Operations and Storage Plan Implementation Status Report.
- The Storage Trend Forecast Document. (DO.1.2)
- The Database Technology Evaluation Report. (DO.1.4)
- The Document on the Entity's Application Performance Assessment.
- The Entity's Applications Development Roadmap with the Status Report on the Implementations.
- The Budget Estimations of the Procurement Transactions of the Future Storage Needs.
- The Data Operations Orchestration Document.

Level 4: Managed
- The Entity is monitoring the progress and effectiveness of the Data Operations (DO) plan implementation with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report with pre-defined Key Performance Indicators (KPIs) for the Implementation Progress and Effectiveness of the Data Operations Plan. (DO.5.1)


Level 5: Pioneer
- The Entity's Data Operations (DO), storage, retention and recovery practices are continuously reviewed and optimized.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Practices of Data Operations, Storage, and Retention.
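Periodic storage forecasting (DO.1.2) can start as simply as fitting a trend to historical utilization. The sketch below fits a plain least-squares line to monthly usage samples and projects how many months remain until a volume is exhausted; the figures are invented for illustration, and a real forecast would account for seasonality and planned workloads.

```python
def forecast_exhaustion(usage_gb: list, capacity_gb: float):
    """Fit a least-squares line to monthly usage samples and return the number
    of months (from the latest sample) until capacity is reached, or None if
    usage is flat or shrinking. Illustrative only."""
    n = len(usage_gb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(usage_gb) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, usage_gb)) / \
            sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    # Solve capacity = slope * t + intercept, relative to the last sample.
    return (capacity_gb - intercept) / slope - (n - 1)

# Hypothetical utilization growing ~50 GB/month on a 2000 GB volume.
months_left = forecast_exhaustion([1200, 1250, 1300, 1350, 1400], capacity_gb=2000)
```

Feeding the result into procurement lead times gives the budget-estimation input mentioned in the Level 3 evidence.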

Maturity Questions – Data Operations Domain

DO.MQ.2  Does the Entity have in place a defined methodology, processes and Standard Operating Procedures (SOPs) for Database operations?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No practices or processes are in place for Database management and operations.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity conducts Data Operations, but not in a standardized approach.
- Operations are being managed without clear procedures or with minimal documented procedures.

Acceptance Evidence:
- Evidence of Data Operations Done within the Entity.

Level 2: Defined
- The Entity has defined and developed practices and processes for Database operations management, monitoring and storage configurations, including access control processes for the Databases.
- All roles and support levels are identified to manage the Database operations.
- Processes and Standard Operating Procedures (SOPs) are documented with clearly defined roles and responsibilities across the operations and IT teams.
- The Database Management System (DBMS) / tool is regularly updated to the latest version.
- Service Level Agreements (SLAs) are developed to guide the Database operations.

Acceptance Evidence:
- The Process Documentation of the Detailed Practices of Data Operations, including:
  A. Database Monitoring. (DO.3.1)
  B. Database Access Control. (DO.3.2)
  C. Storage Configuration Management. (DO.3.3)
  D. DBMS Versioning Mechanism. (DO.3.4)
  E. The Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).

Level 3: Activated
- The Entity is implementing the defined practices and processes of Database operations management and storage configurations.
- Under these processes, the Entity has Standard Operating Procedures (SOPs) that include:
   Database performance monitoring and reporting.
   Role-based access control.
   Storage Configuration Management and DBMS versioning (implementation of Database changes on the production environments).
- The Standard Operating Procedures (SOPs) are activated:
   For all support levels, communication, escalation and resolution.
   With all the involved stakeholders.
- All required Data Operations (DO) roles are activated and onboarded.
- The Service Level Agreements (SLAs) and the Operational Level Agreements (OLAs) are enforced as agreed upon with business stakeholders.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- The Database Monitoring Status Report. (DO.3.1)
- The Data Operations Operating Model.
- Evidence of Agreements (SLAs and OLAs). (DO.3.5)

Level 4: Managed
- The Entity is monitoring the DO processes and SOPs with pre-defined Key Performance Indicators (KPIs) and thresholds.

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report with pre-defined KPIs for Database Operations Management.


Level 5: Pioneer
- DO and storage practices and processes are continuously reviewed and optimized.
- Leading DO practices are evaluated on a regular basis and adopted (e.g., DataOps).
- All operational metrics covering SLAs / OLAs and Key Performance Indicators (KPIs) / Key Quality Indicators (KQIs) are reviewed proactively to improve and optimize the operations.
- Proactive / preventive measures are planned and executed to ensure uninterrupted operations.
- All learning scenarios and experience cases are documented and updated regularly to create a knowledgebase for reusability and future planning.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Implemented Practices and Processes of Data Operations and Storage.
- The Knowledgebase Document.

Maturity Questions – Data Operations Domain

DO.MQ.3  Does the Entity have in place practices and processes for Business Continuity, such as backup and disaster recovery (DR), and a defined Business Continuity Plan (BCP) for the Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- No plans, practices or processes are in place for business continuity, such as backups and disaster recovery.

Acceptance Evidence: Not Applicable.

Level 1: Establishing
- The Entity's Data backup and disaster recovery practices are reactive and on an ad hoc basis.

Acceptance Evidence:
- Evidence of Data Storage Backup Instruction Documents.

Level 2: Defined
- The Entity has defined and developed plans, practices and processes for business continuity, Data backup, and disaster recovery in line with the DM and PDP standards.
- The Recovery Time Objective (RTO) and the Recovery Point Objective (RPO) for the critical systems are defined (e.g., backup frequency, location of backup files, storage medium and scope).

Acceptance Evidence:
- The Developed Processes and Practices for Data Storage Backups and Recovery. (DO.4.1)
- The Business Continuity Plan (BCP) and the Disaster Recovery (DR) Plan and Processes. (DO.4.2)
- A Document of Actions Required to Implement Database Changes and Rollbacks.

Level 3: Activated
- The Entity is implementing its defined Business Continuity Plan (BCP), practices and processes for business continuity, Data backup and recovery.
- The BCP is implemented for the critical systems.
- Role-based access control for production Data, and all RTO / RPO details (high availability, load balancing, etc.), are agreed upon with business and IT stakeholders and implemented.
- The incident response teams are activated.

Acceptance Evidence: all level 2 acceptance evidence requirements, including:
- The Technical Design Document.
- The BCP Run Report.
- The Change Request for Production Data.
- The Production Data Access Control Document. (DO.4.3)

Level 4: Managed
- The Entity is monitoring the practices pertaining to the BCP and DR with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: all level 3 acceptance evidence requirements, including:
- The Monitoring Report with pre-defined KPIs for Business Continuity (BCP) and Disaster Recovery (DR).

Level 5: Pioneer
- The Entity continuously reviews Data storage backup & recovery and disaster recovery processes for improvement and optimization.
- The Entity performs periodic executions and validations of the RTO and RPO to ensure that the systems are well prepared for any unexpected interruption or calamity.
- The BCP details are regularly reviewed to incorporate learnings, modern trends and business needs / demands.

Acceptance Evidence: all level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Business Continuity Plan (BCP).
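One concrete form of periodic RPO validation is checking that the most recent successful backup of each critical system is never older than that system's agreed Recovery Point Objective. The sketch below uses invented system names, RPO targets and timestamps purely to illustrate the check:

```python
from datetime import datetime, timedelta

# Hypothetical RPO targets per critical system.
RPO = {
    "core_registry": timedelta(hours=1),
    "payments_db": timedelta(hours=4),
}

def rpo_breaches(last_backup: dict, now: datetime) -> list:
    """Return the systems whose latest successful backup is older than their RPO."""
    return [system for system, ts in last_backup.items() if now - ts > RPO[system]]

now = datetime(2023, 10, 1, 12, 0)
breaches = rpo_breaches(
    {"core_registry": datetime(2023, 10, 1, 10, 30),  # 90 min old: exceeds 1 h RPO
     "payments_db": datetime(2023, 10, 1, 9, 0)},     # 3 h old: within 4 h RPO
    now,
)
```

An equivalent timed restore drill against the RTO targets completes the periodic validation described above.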

8.1.5. Document and Content Management Domain

Maturity Questions – Document and Content Management Domain

DCM.MQ.1  Has the Entity developed a Document and Content Management (DCM) plan and a Digitization plan to manage the implementation of paperless management activities?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
- The Entity doesn't have any plan in place currently for Document and Content Management (DCM).
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity manages documents and content on a reactive or ad hoc basis without formalized DCM practices.
Acceptance evidence:
- Evidence of Existing DCM Activities.

Level 2: Defined
- The Entity has defined plans to manage the Entity's documents and content lifecycle and digitization activities. A roadmap has been defined with a list of activities, required resources, trainings, awareness and a budget to manage both DCM and Digitization.
- The Entity has defined processes for prioritizing and digitizing documents to be stored in the Entity's Document Management System (DMS).
- The Entity has identified and prioritized the key processes to be implemented as workflows in the DMS to enable automated and paperless management of documents.
Acceptance evidence:
- The DCM Plan. (DCM.1.1)
- The DCM Digitization Plan. (DCM.1.2)
- The Developed DCM Training Plan.
- The Developed Prioritization Process for Documents and Content.
- The Identified DCM Processes / Procedures. (DCM.1.4)
- The Ranked List of Prioritized Document Workflows to be Implemented. (DCM.1.4)

Level 3: Activated
- The Entity is implementing its DCM plan and Digitization plan, including training and awareness activities.
- The Entity is prioritizing its documents to be stored in the Entity's Document Management System (DMS).
- Roles and responsibilities are defined and initiated to manage the DCM activities and the Digitization activities in line with the overall "Data Management Organization" structure.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The DCM Plan Implementation Status Report.
- The Digitization Plan Implementation Status Report.
- The List of Prioritized Documents for Digitization. (DCM.1.3)
- The DCM Roles and Responsibilities.
- The DCM Training Plan Implementation Status Report. (DCM.3.1)

Level 4: Managed
- The Entity is monitoring the effectiveness of the DCM plan and the DCM Digitization plan with pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Reports with Pre-defined KPIs for the Implementation of the DCM Plan & the DCM Digitization Plan.

Level 5: Pioneer
- The Entity continuously reviews and updates both its DCM plan and Digitization plan for improvement and optimization in line with the changing business needs.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the DCM and Digitization Plans.
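The Level 2 requirement for a prioritization process and a ranked list of document workflows can be sketched as a simple scoring exercise. The weighting scheme (monthly volume multiplied by business criticality) and the sample workflows are assumptions for illustration only, not part of the NDI specification.

```python
# Illustrative prioritization of candidate DMS workflows: higher score means
# higher priority for automation. Weights and records are hypothetical.
def rank_workflows(workflows):
    """Rank candidate workflows by monthly volume x business criticality."""
    return sorted(workflows,
                  key=lambda w: w["monthly_volume"] * w["criticality"],
                  reverse=True)

candidates = [
    {"name": "leave requests",     "monthly_volume": 500, "criticality": 2},
    {"name": "contract approvals", "monthly_volume": 40,  "criticality": 5},
    {"name": "invoice processing", "monthly_volume": 300, "criticality": 4},
]
for w in rank_workflows(candidates):
    print(w["name"])
```

An Entity would substitute its own scoring criteria (e.g., regulatory deadlines, paper-handling cost) for the two illustrative factors above.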

Maturity Questions – Document and Content Management Domain

DCM.MQ.2: Has the Entity implemented policies and processes for Document and Content Management (DCM) including: Backup & Recovery, Retention & Disposal, Document & Content Access Approval, and Metadata Publishing?

Level 0: - No policies and processes are in place currently for - Not Applicable.
Absence of Document and Content Management (DCM).
Capabilities

Level 1: - The Entity manages its documents and content on a - Evidences of Processes for
Establishing reactive and ad hoc basis without formalized DCM Retaining & Disposing of
policies and processes. Documents.

52

Level 2: Defined
- The Entity defined a clear set of policies and processes to manage its documents and content.
- Processes are defined for:
   Backup & Recovery.
   Retention & Disposal.
   Document and Content Access Approvals.
   Metadata Publishing.
Acceptance evidence:
- The Developed Policy Document for the DCM Lifecycle. (DCM.2.1)
- The Developed DCM Backup & Recovery Procedures. (DCM.4.1)
- The Developed DCM Retention & Disposal Procedures. (DCM.4.2)
- The Developed DCM Role-Based Access Approval Procedures. (DCM.4.3)
- The Developed DCM Metadata Publishing Procedures.

Level 3: Activated
- The Entity is implementing its DCM processes aligned with the policies.
- The Document Management System (DMS) is included in the Entity's Backup & Recovery plan.
- Documents are transferred to the Entity's archival facility (Archival Register).
- The physical destruction and deletion of the documents is done securely.
- The Entity established role-based access authorizations to the Content Management System (CMS) and the Document Management System (DMS).
- Document and content Metadata is published via an automated tool.
- Periodic DCM audits are conducted to ensure that the right information is getting to the right people at the right time.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The Implemented Workflow for the DCM Retention & Disposal Process. (DCM.4.2)
- The Implemented Workflow for the DCM Access Approval Process. (DCM.4.3)
- Evidence of Document Transfers to the Archival Facility (Archival Register).
- The Documents Disposal Register.
- The Access Rights Documentation.
- The Report on Document & Content Metadata Publishing. (DCM.4.4)

Level 4: Managed
- The Entity is monitoring the DCM processes aligned with the policies with pre-defined Key Performance Indicators (KPIs).
- The Audit Trail / Audit Framework is tracked based on pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) of the DCM Processes Aligned with the Policies. (DCM.5.1)

Level 5: Pioneer
- The Entity regularly reviews and updates its DCM processes for continuous improvement and optimization.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the DCM Processes and Practices.
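The Retention & Disposal procedures referenced above (DCM.4.2) can be sketched as a check that flags documents whose retention period has elapsed, feeding the Documents Disposal Register. The retention periods and document categories below are hypothetical, not values mandated by the framework.

```python
from datetime import date, timedelta

# Illustrative retention periods per document category (in days).
RETENTION_DAYS = {"financial": 10 * 365, "correspondence": 2 * 365}

def due_for_disposal(documents, today):
    """Return the IDs of documents whose retention period has expired."""
    due = []
    for doc in documents:
        expiry = doc["created"] + timedelta(days=RETENTION_DAYS[doc["category"]])
        if today >= expiry:
            due.append(doc["id"])
    return due

docs = [
    {"id": "DOC-1", "category": "correspondence", "created": date(2020, 1, 1)},
    {"id": "DOC-2", "category": "financial",      "created": date(2020, 1, 1)},
]
# Only DOC-1 is due: its 2-year correspondence retention has elapsed.
print(due_for_disposal(docs, today=date(2023, 1, 1)))
```

In practice the flagged documents would then pass through the approved secure-destruction step and be recorded in the disposal register with an audit trail.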

Maturity Questions – Document and Content Management Domain

DCM.MQ.3: Has the Entity implemented a tool to support Document and Content Management (DCM) processes including Digitization Management implementation?

Level 0: Absence of Capabilities
- No tools are in place currently for managing documents or content.
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity manages documents and content in disparate applications.
Acceptance evidence:
- Evidence of Document Storage / Retention & Disposal Activities.

Level 2: Defined
- The Entity identified and developed or acquired the required tool to support DCM.
Acceptance evidence:
- The Documented DCM Tool Requirements.

Level 3: Activated
- The Entity implemented the selected tool for Document and Content Management (DCM), which includes:
   A Document Management System (DMS).
   A Web Content Management System (CMS).
   A collaboration tool.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The Implemented Tool for DCM. (DCM.4.5)
- The Record of Digitized Documents.
- The List of Approved Users.

Level 4: Managed
- The Entity is monitoring the implementation of the tool based on pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Implemented Tool with Pre-defined Key Performance Indicators (KPIs).

Level 5: Pioneer
- For continuous improvement, the Entity regularly reviews:
   The performance of the implemented Digitization Management tool to ensure automation.
   The DCM activities to ensure optimization.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the DCM Solution / Tool Design and DCM Practices.
8.1.6. Data Architecture and Modelling Domain

Maturity Questions – Data Architecture and Modelling Domain

DAM.MQ.1: Has the Entity developed and implemented a plan to improve its Data Architecture capabilities?

Level 0: Absence of Capabilities
- No plan is in place for Data Architecture & Modelling (DAM).
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity manages its Data Architecture & Modelling (DAM) related activities on a reactive and ad hoc basis without formalized practices.
Acceptance evidence:
- The Existing DAM Domain Related Practices.

Level 2: Defined
- The Entity has a defined & approved DAM Management plan.
- The Entity developed an Entity-wide baseline current state and target future state Data Architecture in accordance with the Enterprise Architecture (EA).
- The Entity performed a gap analysis resulting in an implementation roadmap with defined priorities.
- The Entity adopted a standard enterprise architecture framework.
- The Entity published its DAM Policies & Guidelines.
Acceptance evidence:
- The Approved Current State Data Architecture & the Existing Technical Architecture. (DAM.3.1)
- The Approved Target State Data Architecture. (DAM.3.2)
- The Future State Gap Assessment. (DAM.3.3)
- The Approved DAM Plan. (DAM.1.1)
- The Approved Enterprise Architecture Framework.
- The DAM Policy. (DAM.2.1)

Level 3: Activated
- The Entity is implementing the defined implementation roadmap for its Target State Data Architecture.
- The IT architecture ensures that all business architecture outcomes are addressed by selecting and implementing the right technical capabilities.
- The Business Architecture team works closely with the Entity's Enterprise Architecture team to align the applicable standards (e.g., DMBOK, TOGAF, etc.).
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The DAM Plan Implementation Progress Report.
- The EA Framework Implementation Report.

Level 4: Managed
- The Entity is monitoring the effectiveness of the DAM Plan & activities with pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DAM Plan & Activities Implementation with Pre-defined KPIs. (DAM.6.1)

Level 5: Pioneer
- The Entity follows an identified architecture Change Management process to review, approve and implement the changes to the Current and / or Target State Data Architectures.
- The Entity continuously reviews the Target State Data Architecture via checkpoints incorporated into the Software Development Lifecycle (SDLC) process.
- The Entity regularly reviews and refines the Target State roadmap, priorities, and architecture artifacts, reflecting the current Enterprise Architecture strategy and key initiatives.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Data Architecture Plan's Continuous Improvement Report.
- The Architecture Change Management Process. (DAM.5.1)
- The Change Control Document.
- The Data Architecture Checkpoints Report. (DAM.5.2)
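The Level 2 gap analysis between the current state and target state architectures can be sketched as a comparison of capability inventories. The capability names below are hypothetical examples, not NDI terminology.

```python
# Minimal sketch of a current-vs-target gap analysis feeding the
# implementation roadmap. Capability inventories are illustrative.
current_state = {"data warehouse", "nightly batch ETL"}
target_state = {"data warehouse", "data lake", "streaming ingestion", "data catalog"}

gaps = sorted(target_state - current_state)     # capabilities still to build
retired = sorted(current_state - target_state)  # capabilities to phase out

print("roadmap items:", gaps)
print("to retire:", retired)
```

A real Future State Gap Assessment would attach owners, priorities and budget estimates to each roadmap item rather than a bare list.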
Maturity Questions – Data Architecture and Modelling Domain

DAM.MQ.2: Has the Entity developed and implemented practices for Data Architecture & Modelling (DAM) activities (including Data Flows, Data Models and Governance considerations)?

Level 0: Absence of Capabilities
- No practices are in place within the Entity for Data Architecture & Modelling (DAM).
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity maps Data Flows to the business processes only on a reactive basis for specific projects / initiatives.
- Foundational Data Flows are identified, defined and designed but primarily driven by a project / an initiative.
Acceptance evidence:
- The Current DAM Domain Activities.

Level 2: Defined
- The Entity has identified & defined the requirements for developing a Data Lake environment.
- The Entity has defined a Partitioning Strategy for the target state data architecture.
- The Entity identified and mapped its Data Flows to the business processes.
- The Entity's Enterprise Data Model adopts a standard diagramming method (e.g., UML) for documenting the structure, relationships and notations of business entities at the conceptual, logical and physical levels.
- The Entity stores, in a register, its Data & Technical Architecture projects, reference documentation materials and Data Model designs.
- The end-to-end Data Modelling lifecycle is defined with required processes and best practices / guidelines such as naming conventions, Data types, physical model deployment considerations and optimizations, etc.
Acceptance evidence:
- A document containing the Approved Business Processes on the Data Architecture and any Related Data Flow.
- A document containing the Big Data Considerations Including the Data Lake Requirements. (DAM.3.4)
- A document containing the Data Processing Considerations Including the Partitioning Strategy. (DAM.3.5)
- Model representation. (DAM.4.1)
- The DAM Register. (DAM.7.1)
- A document containing the Data Model Representation's Technical Standards & Best Practices.


Level 3: Activated
- The Entity is implementing the defined practices for its DAM activities.
- Multiple Data integration and flow patterns are designed, published and operating on defined Data architecture principles such as "Capture Once, Use Many", etc.
- Structured and streamlined information flows across the Entity are done using the defined and published Data integration patterns.
- The Entity is implementing tools and technologies for DAM initiatives.
- Data Models are uploaded in the Data Catalog.
- The Logical Model is linked with business glossaries and technical attributes to provide end-to-end lineage and activate the impact analysis.
- The Entity's Enterprise Data Model is used throughout the Software Development Lifecycle (SDLC).
- The development is done via a Data Model tool.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The Data Integration Pattern Implementation Document.
- Evidence of DAM Tools & Technologies in Use. (DAM.4.2)
- Evidence of the Enterprise Data Model Uploaded in the Data Catalog.
- Evidence of Applying the Technical Data Standards.

Level 4: Managed
- The Entity is monitoring the implementation of the DAM practices with pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the Implementation of the DAM Practices with Pre-defined KPIs.

Level 5: Pioneer
- The mapping of the Entity's Data Flows to the business processes is continuously monitored, measured, and improved, including:
   Multi-latency, multi-format Data Flows & Architecture Patterns defined, adopted, and mapped to the business processes.
   Entity-wide Data structures and Data services mapped to the business processes.
   Entity-wide Data structures and Data services shared in real-time.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report Including the DAM Business Process Documentation & Other Related Data Flow Documentation.
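The Partitioning Strategy called for at Level 2 (DAM.3.5) can be illustrated with a date-based scheme in which each record is routed to a monthly partition so large tables can be pruned by date. The partition naming convention below is an assumption for illustration.

```python
from datetime import date

# Illustrative date-based partitioning: map a record's business date to a
# monthly partition name, e.g. for a partitioned warehouse table.
def partition_key(record_date: date) -> str:
    """Return the monthly partition a record belongs to."""
    return f"p_{record_date.year}_{record_date.month:02d}"

print(partition_key(date(2023, 10, 5)))  # p_2023_10
```

The same idea generalizes to other keys the strategy might choose, such as region or business line, with the trade-off being query pruning versus partition count.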
8.1.7. Data Sharing & Interoperability Domain

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.1: Has the Entity developed and implemented a Data Sharing and Integration (DSI) Plan in line with the Data Sharing Policies?

Level 0: Absence of Capabilities
- No plan is in place for Data Sharing or Integration.
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity shares Data on a reactive or ad hoc basis.
- Manual Data exchange practices exist within the Entity and with other Entities.
- A DSI practice and initial data integration assessment of the Entity's current state has been conducted to identify pain points and inefficiencies in the data transfer and the integration across the Entity.
Acceptance evidence:
- The Names of the Current Practices of Data Sharing and Integration (DSI).
- The Updated and Approved Results of the Initial Data Integration Assessment. (DSI.1.1)

Level 2: Defined
- A Target Data Integration Architecture was developed based on the identified pain points.
- The Entity has developed an integration strategy and implementation plan for data sharing and integration to remediate the identified pain points and manage the required integration initiatives.
- The Data Sharing policies and guidelines are defined for DSI within and outside the Entity in line with the Data Sharing Regulations published by NDMO-SDAIA.
- The Entity has developed a plan for training every employee involved in the Data Sharing initiatives.
Acceptance evidence:
All level 1 acceptance evidence requirements, including:
- The Developed Data Integration Strategy Document.
- The Developed Target Data Integration Architecture. (DSI.1.2)
- The Developed Data Integration Plan (Including Data Sharing Activities). (DSI.1.3)
- The Developed Data Sharing Policies.
- The Developed Data Sharing Training Plan.

Level 3: Activated
- The Entity is implementing the defined plan and roadmap for data sharing and integration.
- Roles and responsibilities are assigned within the Entity to manage Data Sharing requests and Data integration activities in line with both the Data Management Organization structure and the Data Sharing regulations published by NDMO-SDAIA.
- The Entity is conducting Data Sharing training in line with the Data Sharing training plan.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- A Data Integration Plan Implementation Status Report.
- A Report on the Defined Roles and Responsibilities for DSI.
- A Progress Report on Data Sharing Training Programs. (DSI.2.1)

Level 4: Managed
- The Entity is monitoring the implementation of the DSI activities to ensure completion as per the plan with pre-defined Key Performance Indicators (KPIs).
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Data Integration Plan and Data Sharing Activities.

Level 5: Pioneer
- The DSI plan and the activities are continuously optimized to facilitate innovation and to maintain compliance with the Data Sharing Policies.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Data Integration Plan and Data Sharing Activities.
Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.2: Has the Entity defined and implemented Processes for Sharing Data within the Entity and with other Entities?

Level 0: Absence of Capabilities
- No processes are in place for sharing the Entity's Data internally or externally.
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity shares Data on a reactive or ad hoc basis within and outside the Entity with no formalized practices.
Acceptance evidence:
- Evidence of the Current DSI Practices / Processes by the Entity (Internally & Externally).

Level 2: Defined
- The Entity defined and approved a standardized Data Sharing mechanism for sharing Data internally and externally according to the defined classification levels. The mechanism includes forms, processes and agreements to be used to manage the Data Sharing requests in line with the Data Sharing principles and regulation.
- Data Sharing request forms have been developed and are shared with Data requestors as needed.
- The Data Sharing agreements have been developed for Internal and External Data Sharing.
Acceptance evidence:
- The Process Documentation for Data Sharing (Including Data Classification Levels and Timelines). (DSI.5.1)
- The Developed and Approved Data Sharing Request Forms (Internal and External).
- The Developed and Approved Internal Data Sharing Agreement Template. (DSI.7.1)
- The Developed and Approved External Data Sharing Agreement Template. (DSI.7.2)

Level 3: Activated
- The Entity is implementing and operationalizing the defined and approved Data Sharing mechanism to facilitate Data Sharing within and outside the Entity.
- Data is shared with Entities only through SDAIA's certified and approved channels (e.g., the Government Service Bus (GSB), National Information Center Network, etc.).
- The Entity put in place controls to ensure that only the appropriate authority is allowed to access, obtain and use the shared Data based on the nature and sensitivity of the Data.
- All ongoing Data Sharing agreements are reviewed on a regular basis to accommodate any changes.
- The Entity's official Government website has an established channel to manage Data Sharing requests.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- Evidence of Operationalization of the Data Sharing Process, e.g.: Communication of the Defined Data Sharing Mechanism. (DSI.5.1)
- Evidence of Data Sharing Through SDAIA's Certified and Approved Channels.
- The Access Authorization Controls Document.
- Evidence of Data Sharing Requests Submitted to the Entity and the Requests Submitted by the Entity Through the Established Channel. (DSI.6.1)
- Evidence of the Entity's Responses to the Data Sharing Requests.
- An Adoption Evidence of the Developed Data Sharing Agreement Template for an Internal Data Sharing Request. (DSI.7.1)
- An Adoption Evidence of the Developed Data Sharing Agreement Template for an External Data Sharing Request. (DSI.7.2)
- The Documented Review Outcomes of the Data Sharing Agreements. (DSI.7.3)
Level 4: Managed
- The Entity is measuring the efficiency of Data Sharing activities, tracking progress, and ensuring compliance with pre-defined KPIs.
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined KPIs for the Data Sharing Processes. (DSI.8.1 (3,4,5,6,7))
- The Compliance Audit Methodology.

Level 5: Pioneer
- The Entity's Data Sharing process is automated and updated frequently.
- Data Sharing practices are regularly reviewed and adjusted to address the changing business objectives while implementing continuous improvements.
- Data-as-a-Service (DaaS) and Data products are developed and maintained proactively by the Entity.
- DSI is fully automated and scalable with full control over protection and privacy at the Entity level.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the DSI Processes.
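The Level 3 control that only the appropriate authority may access shared Data based on its nature and sensitivity can be sketched as a clearance check against the dataset's classification level. The level names below follow common practice and are assumptions, not the official KSA data classification scheme.

```python
# Illustrative access-control check for a Data Sharing request: the request
# is approved only when the requester's clearance covers the dataset's
# classification. Level names and ordering are hypothetical.
LEVELS = {"public": 0, "restricted": 1, "confidential": 2, "secret": 3}

def may_share(dataset_classification: str, requester_clearance: str) -> bool:
    """Approve sharing only if clearance >= classification."""
    return LEVELS[requester_clearance] >= LEVELS[dataset_classification]

print(may_share("restricted", "confidential"))  # True
print(may_share("secret", "restricted"))        # False
```

In a real implementation this check would sit behind the approved sharing channel, with every decision logged for the periodic compliance audits.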

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.3: Has the Entity defined and implemented a Data integration architecture to manage the Data movement efficiently across Data stores, systems, and applications?

Level 0: Absence of Capabilities
- No practices are in place to manage the Data movement across the Entity's systems and applications.
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The data movement across Data stores, systems, and applications is done on an ad hoc basis with no formal architecture.
Acceptance evidence:
- Evidence of Data Movement / Integration within the Entity and with Other Entities.

Level 2: Defined
- The Entity defines its business requirements for each initiative requiring data integration between Data stores, systems, and applications.
- The Entity develops a solution design document for each data integration initiative based on the target data integration architecture and the Integration Requirements Document.
Acceptance evidence:
- The Integration Requirements Document. (DSI.3.1)
- The Solution Design Document. (DSI.3.2)


Level 3: Activated
- The Entity is implementing the Data integration architecture following an integration solution development lifecycle for each data integration initiative.
- The Entity performs testing of the developed integration solution before deployment to the production environment.
- Data sources are integrated through established and approved integration patterns.
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- The Document of the Approved Standardized Entity-Wide Integration Solution Development Lifecycle.
- The Developed Test Scripts and the Conducted Tests (Integration, Functional) in Line with the Plan and the Solution Design Document. (DSI.3.3)

Level 4: Managed
- The Entity is monitoring the effectiveness of the integration initiatives with pre-defined KPIs.
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined KPIs for the Data Integration Initiatives. (DSI.8.1 (1,2))
- The Integration Solution Monitoring and Maintenance Document. (DSI.3.4)

Level 5: Pioneer
- The Entity reviews the integration architecture regularly and optimizes the integration practices by adopting cutting-edge methodologies (e.g., implementing Continuous Integration and Continuous Deployment (CI/CD) pipelines) to automate and streamline the improvement of the integration solution.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the Data Integration Practices.
- The Continuous Improvement Mechanisms for Data Integration, e.g.: Continuous Integration & Continuous Delivery (CI/CD) Pipeline Details for Automation.
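The test-before-deploy discipline required at Level 3, and the CI/CD pipelines suggested at Level 5, both reduce to a gate: a release is promoted to production only when every integration test passes. A minimal sketch, with stand-in tests in place of a real suite:

```python
# Illustrative deployment gate for an integration solution lifecycle:
# run each named test and block the release if any fails.
def run_gate(tests):
    """Run each (name, test_fn) pair; return (all_passed, failed_names)."""
    failures = [name for name, fn in tests if not fn()]
    return (not failures, failures)

# Hypothetical integration checks; real ones would query the systems involved.
tests = [
    ("schema matches contract", lambda: True),
    ("row counts reconcile",    lambda: True),
]
ok, failed = run_gate(tests)
print("deploy to production" if ok else f"blocked: {failed}")
```

In a CI/CD pipeline the same gate would run automatically on every change to the integration solution, with failures reported back to the monitoring KPIs.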

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.4: Has the Entity developed and implemented Data Sharing Controls and Processes for efficient Data transformation and movement?

Level 0: Absence of Capabilities
- No controls or processes are in place for data sharing or data transformation and movement.
Acceptance evidence: Not Applicable.

Level 1: Establishing
- The Entity implements controls for data sharing and moves data across applications on an ad hoc basis without formalized controls or processes.
Acceptance evidence:
- The Existing Data Integration or Data Movement Processes within the Entity.

Level 2: Defined
- The Entity has defined data sharing controls, processes and standards for Data transformation and Data movement across the Entity's applications.
- The Entity has conducted a risk assessment of DSI processes.
- The Entity has defined specific processes to classify, protect and support Data privacy.
Acceptance evidence:
- The Developed Data Migration Processes (i.e., ETL). (DSI.4.1)
- The Developed Data Migration Processes (i.e., ELT). (DSI.4.2)
- The Risk Assessment Report on the Entity's Datasets to be Shared.
- The Defined Data Sharing Controls.


Level 3: Activated
- The Entity is implementing the defined processes for data transformation and movement (ETL / ELT), and controls for Data Sharing. This includes processes for integrating data from disparate sources and loading it into the Data Warehouse store, and processes for storing unstructured data in its raw native format in the Data Lake.
- The Entity is implementing standards for Data development, transformation and movement across all applications.
- The Data security controls are implemented in line with the security controls published by the KSA's National Cybersecurity Authority (NCA).
- Appropriate controls are applied to the Datasets to be shared in line with the Data Sharing regulations and other relevant regulations (e.g., Personal Data Protection (PDP), etc.).
Acceptance evidence:
All level 2 acceptance evidence requirements, including:
- A Report on the Implemented Controls (e.g.: Data Security & Protection, Data Sharing, Data Integration, Data Access, etc.).
- Evidence of the Implemented Data Migration Processes (i.e., "Extract, Transform, Load" (ETL)). (DSI.4.1)
- Evidence of the Implemented Data Migration Processes (i.e., "Extract, Load, Transform" (ELT)). (DSI.4.2)

Level 4: Managed
- The Entity is monitoring the effectiveness of the DSI controls, processes and standards with pre-defined KPIs.
Acceptance evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined KPIs for the DSI Controls, Processes and Standards.

Level 5: Pioneer
- The Entity continuously reviews its Data sharing and integration controls and practices for optimization by adopting advanced interoperability standards (e.g., event-driven, event sourcing, Data mesh, etc.).
- The transformed and merged Data is available upon request as a service (DaaS) for various transactions, reporting and analysis in the Entity for innovation (e.g., Data-driven transformation and products, etc.).
- Proactive periodic audits and reviews of the applications and systems are performed in an automated way.
Acceptance evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on Reviewing the DSI Processes and Mechanisms to Automate the DSI Controls and Practices.
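The distinction between the ETL (DSI.4.1) and ELT (DSI.4.2) migration processes is one of ordering: ETL transforms data before loading it into the target store, while ELT loads raw data first and transforms it inside the target. A minimal ETL sketch with illustrative rows:

```python
# Minimal ETL sketch: transform *before* load. In ELT, the same cleanse step
# would run inside the target store after loading the raw rows.
# Source rows and the cleansing rule are illustrative assumptions.
def extract():
    return [{"national_id": " 123 ", "city": "riyadh"},
            {"national_id": "456",   "city": "JEDDAH"}]

def transform(rows):
    """Cleanse rows: trim identifiers, normalize city casing."""
    return [{"national_id": r["national_id"].strip(), "city": r["city"].title()}
            for r in rows]

def load(rows, warehouse):
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)  # E -> T -> L
print(warehouse)
```

The ELT variant would call `load(extract(), data_lake)` first and defer the cleansing to the target platform, which is why the framework tracks the two processes as separate specifications.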
8.1.8. Reference and Master Data Management Domain

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.1: Has the Entity developed and implemented a plan focused on improving its Reference & Master Data (RMD) Management capabilities?

Level 0: Absence of Capabilities
Description:
- The Entity doesn't have a plan in place for Reference & Master Data (RMD) Management.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity manages its Reference & Master Data (RMD) on a reactive or ad hoc basis without formalized practices.
Acceptance Evidence:
- The Currently Existing Practices Related to the RMD Management Domain.

Level 2: Defined
Description:
- The Entity has a defined and approved RMD Management plan.
- A roadmap is defined with a list of activities, required resources, trainings and a budget to manage the RMD plan implementation.
Acceptance Evidence:
- The Developed and Approved RMD Management Plan. (RMD.1.1)

Level 3: Activated
Description:
- The Entity is implementing the defined RMD Management plan and roadmap.
- Clear roles and responsibilities are defined and activated for the RMD activities.
- Business and IT Data stewards are assigned in line with the "Data Management Organization" structure.
- Training is being conducted for all Entity's employees responsible for the management of its RMD, as required, in line with the change management practices.
- The RMD change request logs and the RMD Stewardship decisions are documented.
- The documented RMD Management initiatives registered in the statement of Architecture work are being implemented.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The RMD Management Plan Implementation Status Report.
- The RMD Management Training Implementation Status Report. (RMD.3.1)
- The RMD Management Operating Model Showing the Stewardship Coverage. (RMD.4.1)
- The RMD Change Request Logs. (RMD.6.1)
- The RMD Management Documents & Artifacts. (RMD.6.2)


Level 4: Managed
Description:
- The Entity is monitoring the effectiveness of the RMD Management plan and activities with pre-defined KPIs.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Plan Implementation with Pre-defined KPIs.

Level 5: Pioneer
Description:
- The Entity's RMD Management plan and activities are continuously reviewed and updated for optimization and improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Management Plan.
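The Level 4 monitoring report is built from pre-defined KPIs. As an illustrative sketch only, the snippet below computes one hypothetical KPI, a "plan activity completion rate"; the NDI does not prescribe this KPI or these activity records.

```python
# Hypothetical Level 4 KPI computation for an RMD plan monitoring report.
# The activities and statuses below are invented examples.
activities = [
    {"name": "Assign RMD stewards",     "status": "done"},
    {"name": "Approve RMD standards",   "status": "done"},
    {"name": "Deploy RMD change log",   "status": "in_progress"},
    {"name": "Train business stewards", "status": "planned"},
]

completed = sum(1 for a in activities if a["status"] == "done")
completion_rate = 100.0 * completed / len(activities)

print(f"RMD plan completion: {completion_rate:.0f}%")  # RMD plan completion: 50%
```

In practice each KPI would be defined in advance (target, owner, data source) and the computed values would feed the periodic monitoring report submitted as acceptance evidence.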

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.2: Has the Entity defined and implemented processes to manage its Reference & Master Data (RMD) objects from creation to archival?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No processes are in place for Reference & Master Data (RMD) Management.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity manages its RMD on a reactive or ad hoc basis (e.g., projects / initiatives) without formalized practices.
Acceptance Evidence:
- Evidence of Projects / Initiatives with Reference Data and/or Master Data Identified.

Level 2: Defined
Description:
- The Entity has defined processes to manage its RMD objects across the business and technical applications.
- RMD requirements are documented in line with business needs.
- RMD objects are identified, prioritized, and categorized (internal or external).
- Standards and rules for matching and merging Master Data Objects and Reference Datasets are defined.
- Processes are developed for the RMD lifecycle (creation, modification, and archiving).
- Service Level Agreements (SLAs) are defined for the processes.
Acceptance Evidence:
- The Identified, Prioritized and Categorized RMD Objects. (RMD.1.2)
- Reference Data Categorization. (RMD.1.3)
- Master Data Categorization. (RMD.1.4)
- The Reference & Master Data Requirements. (RMD.2.1)
- The Defined SLAs for RMD Lifecycle Management. (RMD.5.1)

Level 3: Activated
Description:
- The Entity is implementing its RMD Management Processes.
- Match-and-merge and survivorship rules for Master Data Objects are embedded in the Data transformation rules and implemented on the Entity's Data applications.
- Rules for standardized Reference Data are embedded in the Entity's Data applications and managed centrally.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The RMD Lifecycle Management Process. (RMD.4.2)
- Evidence of the Implementation & Adoption of the National Reference Datasets.

Level 4: Managed
Description:
- The Entity is monitoring the processes for managing the RMD Objects across the Data lifecycle with pre-defined KPIs.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Processes with Pre-defined KPIs & SLAs. (RMD.5.2)

Level 5: Pioneer
Description:
- The Entity's RMD Management Processes and Standards are continuously monitored for optimization and improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Management Processes & Standards.
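The match-and-merge and survivorship rules referenced at Level 3 can be sketched concretely. This is a hedged illustration, not the NDI's method: the matching key (national ID), the "most recently updated wins" survivorship rule, and the records themselves are hypothetical; the standard only requires that such rules be defined and embedded in the transformation logic.

```python
# Illustrative match-and-merge with a survivorship rule for Master Data.
# Keys, rules, and records are invented examples.
from datetime import date

records = [
    {"national_id": "123", "email": "old@x.sa", "source": "CRM",
     "updated": date(2022, 1, 10)},
    {"national_id": "123", "email": "new@x.sa", "source": "Billing",
     "updated": date(2023, 6, 1)},
]

def match(a, b):
    # Match rule: the same national_id means the same master entity.
    return a["national_id"] == b["national_id"]

def merge(group):
    # Survivorship rule: the most recently updated record's values win.
    survivor = max(group, key=lambda r: r["updated"])
    return {"national_id": survivor["national_id"], "email": survivor["email"]}

assert match(records[0], records[1])
golden = merge(records)
print(golden)  # {'national_id': '123', 'email': 'new@x.sa'}
```

Real implementations typically apply survivorship per attribute (e.g., trust the billing system for addresses, the CRM for contact preferences) rather than per record, but the shape of the rule is the same.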

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.3: Has the Entity implemented a Data Hub (or Tool) as the trusted Data source to support the RMD Management Processes?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No tools are in place for managing the Reference & Master Data (RMD).
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity manages its RMD within standalone Data sources without standardized tools.
Acceptance Evidence:
- Evidence of Reference & Master Data (RMD) Used for Project Purposes.

Level 2: Defined
Description:
- The Entity identified the required Hub / Tool for RMD Management based on business and technical requirements.
- A suitable Data Hub architecture implementation pattern is agreed upon and the Data Hub design is developed.
- A conceptual and information architecture is developed based on the selected Data Hub Architecture Design.
- RMD models and architectures are developed.
- RMD Hub / Tool technical requirements are documented based on the defined target RMD Information Architecture.
Acceptance Evidence:
- The Target RMD Management Architecture Design. (RMD.2.2)
- The Developed RMD Conceptual Architecture. (RMD.2.3)
- The Developed RMD Information Architecture. (RMD.2.4)
- The RMD Hub / Tool Technical Requirements. (RMD.2.5)

Level 3: Activated
Description:
- The Entity implemented the selected Data Hub for RMD Management and is operating the hub as the Trusted Source of RMD across the Entity.
- The implemented Data Hub is flexible enough to accommodate new Data sources and to support workflow capabilities and customizations (e.g., Data localization, privacy, consent management, hierarchy management, exception processing, and automated alert capabilities).
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The Implemented RMD Management Hub. (RMD.4.3)
- The Workflow Documentation Showing the Establishment of the Data Hub as the Entity's Trusted Source. (RMD.4.4)

Level 4: Managed
Description:
- The Entity is monitoring the adoption and usage of the implemented Data Hub with the pre-defined KPIs.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Hub / Tool Capabilities with Pre-defined Key Performance Indicators (KPIs).

Level 5: Pioneer
Description:
- The Entity continuously inspects the implemented Data Hub to:
   Ensure full coverage of all new and updated Data sources.
   Ensure that all deployed tools and technologies are regularly optimized.
   Ensure that the level of automation is increased based on industry and global trends.
- A centralized Hub / Tool is developed as the Trusted Data Source wherein all the RMD Objects are hosted and maintained.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Information Architecture and the Implemented Data Hub.

8.1.9. Business Intelligence and Analytics Domain

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.1: Has the Entity developed and implemented a plan to manage and orchestrate its Business Intelligence & Analytics (BIA) activities?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No formal Business Intelligence & Analytics (BIA) plan is in place.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity implements BIA activities on a reactive or ad hoc basis.
Acceptance Evidence:
- Current activities related to BIA.

Level 2: Defined
Description:
- The Entity has a defined and approved BIA plan based on the overall Data Management (DM) strategy.
- A roadmap is defined with a list of activities, required resources and budget to manage the BIA implementation.
Acceptance Evidence:
- The Defined & Approved BIA Plan. (BIA.1.1)

Level 3: Activated
Description:
- The Entity is implementing the defined & approved BIA plan & roadmap.
- Clear roles and responsibilities are defined for the BIA activities in line with the Data Management Organization structure.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The BIA Plan & Roadmap Implementation Status Report.
- The Defined & Documented Roles & Responsibilities for BIA Activities, Including Data Stewardship Roles.

Level 4: Managed
Description:
- The Entity is monitoring the effectiveness of the BIA plan and activities with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Effectiveness Monitoring Report of the BIA Plan & Activities with Pre-defined KPIs.

Level 5: Pioneer
Description:
- The BIA plan and activities are continuously reviewed and updated for optimization and improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- Regular reviews and improvements to the Business Intelligence and Analytics plan and roadmap.

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.2: Has the Entity identified BIA use cases and defined a plan for the use case implementation?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No defined BIA use cases are identified within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity identifies and implements BIA use cases on an ad hoc basis or for specific projects and initiatives.
Acceptance Evidence:
- The list of the implemented BIA use cases within the Entity.

Level 2: Defined
Description:
- The Entity has identified and prioritized its BIA use cases using a defined prioritization framework.
- Shortlisted use cases have been documented in a detailed BIA Use Case Portfolio document.
- An implementation plan has been developed for each shortlisted and approved use case, with implementation priorities based on the Entity's defined use case implementation approach.
- A use case validation process has been developed.
Acceptance Evidence:
- The Approved BIA Business Cases / BIA Use Cases.
- The Approved BIA Use Cases Prioritization Framework. (BIA.1.2)
- The Shortlisted / Prioritized Use Cases Based on Business Needs. (BIA.1.2)
- The BIA Use Case Portfolio Document with Details of Each Use Case. (BIA.1.3)
- The Approved Use Case Implementation Plan. (BIA.1.4)
- The Defined Use Case Implementation Approach (e.g., DevOps, Agile, etc.).
- The Approved Use Case Validation Process. (BIA.3.1)

Level 3: Activated
Description:
- The Entity is implementing and validating the prioritized use cases in order to transform the Entity into a Data-driven organization.
- The new use cases are evaluated and implemented based on the business requirements.
- The implemented BIA use cases and their final outcomes are documented in a register.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The Implemented and New Use Cases.
- The Up-to-date BIA Use Cases Register. (BIA.5.1)
- The Outcomes of the BIA Use Case Validation Processes.

Level 4: Managed
Description:
- The Entity is evaluating / monitoring the performance of analytical models and the business impacts of implemented use cases with pre-defined Key Performance Indicators (KPIs).
- The technical or business impacts and the Return on Investment (ROI) from the use cases are monitored with pre-defined KPIs.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the BIA Portfolio Effectiveness with Pre-defined KPIs. (BIA.4.1)

Level 5: Pioneer
Description:
- The Entity's activated and implemented BIA use cases are regularly reviewed and optimized.
- The potential impact, competitive advantage, Total Cost of Ownership (TCO) and Return on Investment (ROI) are continuously reviewed and updated with a defined calculation methodology.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- Continuous Improvement Report on BIA Use Cases.
- The Updated BIA Use Case Register and the Optimized Use Cases.
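A use case prioritization framework of the kind Level 2 requires is often a weighted scoring model. The sketch below is a hypothetical example: the criteria, weights, use case names, and scores are all invented, since each Entity defines its own framework.

```python
# Illustrative weighted-scoring prioritization for BIA use cases.
# Criteria, weights, and scores are hypothetical examples.
weights = {"business_value": 0.5, "feasibility": 0.3, "data_readiness": 0.2}

use_cases = {
    "Churn prediction":        {"business_value": 9, "feasibility": 6, "data_readiness": 7},
    "Service demand forecast": {"business_value": 8, "feasibility": 8, "data_readiness": 9},
}

def priority(scores):
    # Priority = weighted sum of criterion scores.
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(use_cases, key=lambda u: priority(use_cases[u]), reverse=True)
print(ranked[0])  # Service demand forecast
```

The same scores can later feed the Level 5 review, where TCO and ROI are folded into the calculation methodology and the ranking is periodically refreshed.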

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.3: Has the Entity defined and implemented practices to manage and govern the BIA processes?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No practices are in place for BIA.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity manages its BIA processes on a reactive or ad hoc basis without formalized practices.
- Reporting is done either directly on source systems, or different Data Marts are developed based on the business needs with no semantic relationship / link among them.
- The BIA governance focuses only on the requirements of the projects.
- The DQ issues are resolved during the Data population for BI by the BI team.
Acceptance Evidence:
- The Existing BIA Domain Processes & Governance Documentation.
- The List of Existing Reports & Dashboards.

Level 2: Defined
Description:
- The Entity has defined formal BIA practices, including the development of processes for a Data Warehouse or a Data Lake with logical models in place to model the business functions.
- A semantic layer is developed and maintained for supporting BI and Advanced Analytics use cases.
- The Entity is developing a change management plan including training programs, awareness campaigns and a release management process to guide the publishing mechanism of reports, dashboards and implemented use cases.
Acceptance Evidence:
- The Developed & Approved Processes of BIA Management.
- The Approved Process of New Data Source Requirements.
- The Approved Demand Management Process.
- The Development and Maintenance Document of the Semantic Layer.
- The Advanced Analytics Management and Governance Process.
- The Identified & Developed Data Sources & Data Marts.
- The Developed & Approved Change Management Plan, including the training programs. (BIA.2.1)
- The Developed & Approved Change Management Plan, including the awareness campaigns. (BIA.2.2)

Level 3: Activated
Description:
- The Entity is implementing the defined BIA processes and practices which are established to govern BIA activities.
- Self-service analytics are fully activated, covering both BI and Artificial Intelligence / Machine Learning (AI/ML).
- Business users have full access to all types of Data (raw, processed, modelled) in a governed manner.
- End-to-end governance is in place covering emerging topics such as AI ethics, etc.
- Advanced Analytics (AI/ML) governance and management (AI/ML Ops) foundational processes are in place operationally.
- A central Data Science & BIA team has been set up with defined roles and responsibilities.
- The central Data Science & BIA team has a clear process flow to handle the Entity's BIA needs.
- BIA training courses are conducted for all employees involved in the BIA initiatives to upskill the analytics capabilities, and awareness campaigns are conducted to promote the awareness, education & adoption of the BIA capabilities.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- Evidence of the Adoption & Implementation of BIA Management and Governance.
- The Approved Operating Model with Defined Roles & Responsibilities for the Data Science Team. (BIA.3.2)
- Evidence of Training Courses & Awareness Campaigns Conducted.
- The Approved User Acceptance Test (UAT) Documents.
- The Approved Outcomes as Reports & Dashboards Produced for the Business Units.
- The Capacity Planning Document.

Level 4: Managed
Description:
- The Entity is monitoring BIA processes, governance and the performance of the centralized BI team with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Effectiveness of the Practices and Processes for Managing and Governing BIA, with Pre-defined KPIs.
- The Monitoring Report on the Effectiveness of the Training and Awareness Sessions Conducted, with Pre-defined KPIs. (BIA.4.1)
- The Performance Measurement Report for the BIA Team.

Level 5: Pioneer
Description:
- The Entity created a BI Competency Center including the required roles, skills & capabilities to fulfil all BI requirements.
- The BI Competency Center governs the Entity's BI management, including demand management and change management.
- The performance of the BIA team is continuously reviewed, measured and optimized.
- The BIA capacity planning is reviewed regularly and fulfilled to support the growing demand.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Establishment of the Business Intelligence and Analytics Center Document.
- The Continuous Review and Improvement of BIA Management and Governance Practices.
- The Continuous Review and Improvement of BIA Team Performance.
- The Updated Capacity Planning Document.
- The Revised Demand Management Process.

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.4: Has the Entity implemented the right tools, technologies, and skills to empower users and support the implementation of the BIA use cases?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No technology for BIA is in place.
- The reporting is done manually.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity implements its BIA activities on an ad hoc basis or in individual applications without formalized tools.
Acceptance Evidence:
- The list of BIA tools.
- The list of reports.

Level 2: Defined
Description:
- The Entity selected standardized tools to develop standard analytics artifacts (reports and dashboards).
- The Business and IT stakeholders collaborate to develop, enhance and maintain the semantic layer.
Acceptance Evidence:
All level 1 acceptance evidence requirements, including:
- The list of business units which use the BIA technology tools.

Level 3: Activated
Description:
- The Entity selected and implemented tools and technologies that are being adopted across the Entity.
- The Entity has self-service advanced analytics (Data discovery, wrangling, statistical analysis, etc.) and collaborative Artificial Intelligence / Machine Learning (AI/ML) using a low-code / no-code environment.
- Users have the choice of a unified interface in their workflows by using:
   Tools.
   Programming languages (e.g., Python, R, Go, etc.).
   Platforms (e.g., Anaconda, Jupyter, etc.).
   Libraries (e.g., Pandas, NumPy, etc.).
- Project documentation is managed centrally and shared across the Entity.
- Common AI/ML models are versioned and documented through the tools.
- Users have the ability to reuse those AI/ML models in developing new BI analytics & models.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The list of users with roles and privileges.
- The Approved Architecture & Documentation for Advanced Analytics.
- Advanced Analytics and Communication Project Management Documents.
- The Approved Advanced Analytics Models Documents.
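As a small sketch of the self-service stack this level describes, the snippet below uses two of the libraries the NDI names (Pandas, NumPy) for basic wrangling and a per-group statistical summary. The dataset is invented for illustration.

```python
# Illustrative self-service analytics step: wrangle a small dataset and
# produce a per-region statistical summary. Data is a hypothetical example.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "region":   ["Riyadh", "Riyadh", "Jeddah", "Jeddah"],
    "requests": [120, 135, 90, 110],
})

# Overall average, then mean and standard deviation per region.
overall = float(np.mean(df["requests"]))               # 113.75
summary = df.groupby("region")["requests"].agg(["mean", "std"])
print(summary.loc["Riyadh", "mean"])  # 127.5
```

In a governed environment the same notebook would read from the curated semantic layer rather than a hand-built frame, and the resulting summary would be published through the approved reporting tools.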


Level 4: Managed
Description:
- The Entity is monitoring the adoption and utilization of the BIA tools and technologies with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Report on the Effectiveness Monitoring of BIA Tools and Techniques with Pre-defined KPIs.

Level 5: Pioneer
Description:
- New technologies are regularly evaluated, tested, and introduced to keep the capabilities aligned with future needs.
- Advanced frameworks are used (e.g., Spark, TensorFlow, PyTorch, etc.) and integrated with the Entity's unified AI/ML environment.
- Advanced capabilities such as AI/ML engineering tools and features are used for complex processing and reusability.
- Tools and technologies which use AI/ML Ops are practiced to manage the end-to-end AI/ML lifecycle.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Document for Continuous Adoption of Technologies, Tools, Frameworks, and Features.
- The Continuous Improvement Report of the BIA and Advanced Analytics Technology Solutions, including the Proof of Concept (POC) Roadmap.
- A Report on the Proof of Concept (POC) Results.

8.1.10. Data Value Realization Domain

Maturity Questions – Data Value Realization Domain

DVR.MQ.1: Has the Entity developed a plan to identify, document and realize its Data revenue generation potential and implement Data-related cost optimization initiatives and use cases?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No plan is in place to generate value from Data or to implement Data-related cost optimization initiatives.
- No use cases are identified for Data revenue generation or Data-related cost optimization.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity has existing practices for Data Value Realization (DVR), Data-related cost optimization and the identification of use cases; however, this is done on a reactive or ad hoc basis without formalized DVR practices.
Acceptance Evidence:
- The Existing List of Data Value Realization (DVR) Activities.

Level 2: Defined
Description:
- The Entity has a defined and approved plan to identify, document and realize value from its Data through revenue generation initiatives and Data-related cost optimization initiatives.
- The Entity identified use cases for revenue generation and cost optimization based on the guidelines for ethical use cases.
- The Entity estimated and projected a payback period and Return on Investment (ROI) for each identified use case.
- Roles and responsibilities to support the DVR implementation have been identified and documented.
Acceptance Evidence:
- The Data Value Realization (DVR) Plan. (DVR.1.2)
- The List of Identified Use Cases for Both Revenue Generation & Cost Optimization. (DVR.1.1)
- A Document Explaining the Payback Period and Return on Investment (ROI) for Each Identified Use Case. (DVR.1.1)

Level 3: Activated
Description:
- The Entity is implementing its DVR plan through use cases for revenue generation and cost optimization.
- Roles and responsibilities for DVR use case implementations are activated in line with the Data Management Organization structure.
- The Entity is maintaining the DVR use cases.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The Report on DVR Monitoring & Maintenance. (DVR.3.1)

Level 4: Managed
Description:
- The Entity is monitoring the efficiency of the plan implementation and DVR activities with pre-defined Key Performance Indicators (KPIs) in line with the "DM and PDP Standards" document.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DVR Use Cases with Pre-defined KPIs. (DVR.4.1 (1, 2, 6, 7))
- The Monitoring Report of the DVR Activities & Plan with Pre-defined KPIs.

Level 5: Pioneer
Description:
- The DVR plan implementation is regularly reviewed and optimized to ensure continuous improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Review & Continuous Improvement Document of the DVR Plan.
- The Revised & Updated DVR KPIs.
- Evidence of New Partnerships (e.g., MoU, jointly developed use cases or products, etc.).
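The payback-period and ROI estimates Level 2 asks for can be computed with the standard definitions: ROI as net gain over cost, and payback as the time until cumulative net cash flow covers the initial investment. The NDI does not prescribe a formula, and the figures below are invented for illustration.

```python
# Hypothetical payback-period and ROI estimate for one DVR use case.
# Cash-flow figures are invented examples.
initial_cost = 400_000                              # SAR, one-off cost
annual_net_benefit = [150_000, 200_000, 250_000]    # SAR per year

# ROI over the horizon = (total benefit - cost) / cost.
roi = (sum(annual_net_benefit) - initial_cost) / initial_cost

# Payback = first year in which cumulative benefit covers the cost.
cumulative, payback_years = 0, None
for year, benefit in enumerate(annual_net_benefit, start=1):
    cumulative += benefit
    if cumulative >= initial_cost:
        payback_years = year
        break

print(f"ROI: {roi:.0%}, payback: {payback_years} years")  # ROI: 50%, payback: 3 years
```

A fuller business case would usually discount the cash flows (NPV) as well; the simple undiscounted form above is enough for the per-use-case document the level requires.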

Maturity Questions – Data Value Realization Domain

DVR.MQ.2: Has the Entity implemented practices to support a Data revenue generation process?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No processes are in place for use cases of both Data revenue generation and cost optimization.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity has not implemented a formalized process for Data revenue generation; it is done on a reactive or ad hoc basis.
Acceptance Evidence:
- The Existing Practices Related to Supporting the Data Revenue Generation Process.

Level 2: Defined
Description:
- The Entity has a defined, documented, and approved process for Data revenue generation covering the details of:
   The selection of a pricing scheme model.
   The calculation of the total cost.
   Aligning the adopted charging model with the business needs.
Acceptance Evidence:
- The Defined Pricing Scheme. (DVR.2.1)
- The Data or Data Product Price Calculation. (DVR.2.2)
- The Adopted / Approved Charging Model for Each Data or Data Product. (DVR.2.3)

Level 3: Activated
Description:
- The Entity is implementing the defined process of Data revenue generation.
- For each Data revenue generation request, the Entity submits the required documentation to NDMO-SDAIA.
- For each Data or Data Product that the Entity is expecting to generate revenue from, the Entity submits a revenue generation request to NDMO-SDAIA.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- Evidence of Revenue Generation Requests Submitted to NDMO-SDAIA. (DVR.2.4)

Level 4: Managed
Description:
- The Entity is monitoring the efficiency of the Data revenue generation process with pre-defined KPIs.
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the Data Revenue Generation Process with Pre-defined KPIs. (DVR.4.1 (3, 4, 5))

Level 5: Pioneer
Description:
- The revenue generation process is regularly reviewed and optimized to ensure continuous improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the Data Revenue Generation Process.
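The three elements the Level 2 process covers (pricing scheme, total cost calculation, charging model) can be sketched together. This is a hedged illustration only: the cost components, the cost-plus margin, and the per-request charging model are invented examples, not a scheme prescribed by DVR.2.1-2.3.

```python
# Illustrative cost-plus price calculation for a hypothetical Data product.
# All cost figures, the margin, and the charging model are invented.
cost_components = {
    "storage":       12_000,   # SAR / year
    "processing":    30_000,
    "support_staff": 58_000,
}
total_cost = sum(cost_components.values())      # 100,000 SAR / year

margin = 0.20                                   # pricing scheme: cost plus 20%
annual_price = total_cost * (1 + margin)        # 120,000 SAR / year

expected_requests = 24_000                      # charging model: per API request
price_per_request = annual_price / expected_requests
print(price_per_request)  # 5.0 SAR per request
```

Other charging models (flat subscription, tiered volume bands) plug into the same structure by changing only the last step, which is why the NDI treats the pricing scheme and the charging model as separate artifacts.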

8.1.11. Open Data Domain

Maturity Questions – Open Data Domain

OD.MQ.1: Has the Entity defined, established, and implemented a plan to identify and coordinate the publishing of its Open Datasets?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No Open Data (OD) plan is in place.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- The Entity has existing practices to maintain OD, but these practices are not formalized.
Acceptance Evidence:
- The existing list of Open Datasets.

Level 2: Defined
Description:
- The Entity has defined an OD plan and framework.
- An OD Management structure is in place, and a change management plan has been developed to conduct training on OD activities and awareness campaigns.
Acceptance Evidence:
- The approved Open Data Framework.
- The approved Open Data plan. (OD.1.1)
- The OD Management structure.
- The developed and approved plan for change management, including the OD training plan.
- The developed and approved plan for change management, including the OD awareness campaigns plan. (OD.2.1)

Level 3: Activated
Description:
- The Entity is implementing the activities in the defined Open Data plan, including training & awareness programs.
- An annual report on the Open Data plan & progress is submitted to NDMO-SDAIA.
- The Entity appointed the following roles in line with the Data Management Organization:
   Open Data & Information Access Officer (ODIAO) to lead the Open Data activities within the Entity.
   Business Data Executive (BDE).
   Business Data Steward.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The Open Data Plan Implementation Status Report.
- Evidence of submission of the annual Compliance report to NDMO-SDAIA.
- Assignment decisions / appointees to job roles.
- Evidence of Implementation of the Change Management Program (the conducted training courses and the launched awareness campaigns related to Open Data).

Level 4: Managed
Description:
- The Entity is monitoring the effectiveness of the Open Data activities plan with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Effectiveness of the Open Data Plan with Pre-defined KPIs. (OD.4.1)

Level 5: Pioneer
Description:
- The Open Data plan is revised & optimized for continuous improvement.
Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The periodic reviews and improvements of the Open Data plan.

Maturity Questions – Open Data Domain

OD.MQ.2: Has the Entity defined, established, and implemented a process to support the identification of Open Data (OD)?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities
Description:
- No existing process is in place to identify Open Data (OD) within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Description:
- Open Data identification practices are performed on a reactive or ad hoc basis, without formalized OD practices.
Acceptance Evidence:
- The Existing Practices for OD Identification.

Level 2: Defined
Description:
- The Entity has developed and documented the processes required to manage the lifecycle of Open Data.
- In alignment with the Open Data Policy, the Entity defined a process to identify public Datasets to be published, including a mechanism to evaluate all Data classified as "Public Data" to determine whether it should be published as OD or not.
Acceptance Evidence:
- The Defined Process Documentation for Managing the Lifecycle of Open Data. (OD.3.1)
- The Defined Process Documentation for Identifying Open Data.
- The Process of Evaluating the Value and Impact of Open or Public Datasets.

Level 3: Activated
Description:
- The Entity is implementing the defined processes for identifying the Datasets to be published.
- The Entity is evaluating the identified Datasets in terms of value, whilst also conducting risk assessments to ensure there will be no security or privacy threats when published.
- The identified Open Datasets are being aligned with the regulations published by NDMO-SDAIA.
- The Entity has identified & documented metadata for the Entity's Open Datasets.
Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The OD Identification Process Implementation Status Report.
- The List of Identified Open Datasets with the Assigned Priorities. (OD.3.2)
- The Identified & Documented Metadata for the Open Datasets. (OD.3.4)

90
Maturity Questions – Open Data Domain

OD.MQ.2 Has the Entity defined, established, and implemented a process to support the identification of Open Data (OD)?

Related
Level Name Level Description Acceptance Evidence
Specification

- Value and Impact Assessment - OD.3.2


Report for Identified Open or
Public Datasets.

Level 4: - The Entity is monitoring the effectiveness of the All level 3 acceptance evidence - OD.4.1
Manage implemented processes of identifying Open Datasets with including:
d pre-defined KPIs.
- The Monitoring Report of OD
Identification & Prioritization
Processes with Pre-defined
KPIs.

Level - The Entity is conducting continuous improvement, enabled All level 4 acceptance evidence
5: by automated processes to support OD identification. requirements including:
Pionee
r - The Continuous Improvement
Report Showing the
Documented Periodic
Reviews & Outcomes of the
OD Identification Processes
and the Implemented
Automation.

Maturity Questions – Open Data Domain

OD.MQ.3 Has the Entity defined, established, and implemented a process to support publishing its Open Datasets?

Level 0: Absence of Capabilities
Level Description:
- No existing processes are in place for managing Open Datasets within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- Open Data (OD) is shared on a reactive or ad hoc basis.
Acceptance Evidence:
- The Existing Practices for Publishing Open Datasets.

Level 2: Defined
Level Description:
- The Entity has defined a process to manage the publishing of Open Datasets as part of the documented Open Data processes.
- The Entity has defined a process to regularly review, update, and document changes to its published Open Datasets and associated metadata to ensure they meet defined regulatory requirements.
Acceptance Evidence:
- The Defined Process Documentation for Publishing Open Data.
- The Defined Process Documentation for Open Data Maintenance.

Level 3: Activated
Level Description:
- The Entity is implementing the defined process for publishing Open Data by collaborating with the National Information Center (NIC) in SDAIA to publish the Open Datasets on the Saudi Open Data Portal under the KSA Open Data License.
- The Entity utilizes standardized formats when publishing its datasets so that they are in machine-readable form.
- The Entity ensures that the open datasets are published in high quality to ensure fitness for use.
- The Entity is updating the published open datasets and the associated Metadata, and maintaining data traceability and a versioning history of the open datasets.
- The Entity is documenting its identified open datasets and activities in a register.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- Status Report on the Implementation of the Open Data Publishing Process.
- Evidence of Published Datasets on the Saudi Open Data Portal. (OD.3.3)
- Evidence of Feedback / Comments Received on OD.
- Evidence of formats used to standardize open datasets in machine-readable form. (OD.3.5)
- Evidence of data standards applied on open datasets to ensure high data quality.
- Open Data Maintenance Report. (OD.3.6)
- Open Data Register containing Records of Open Data Activities and Published Open Datasets. (OD.5.1)

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the Open Data processes with pre-defined KPIs, covering:
  1. The number of downloads per published Open Dataset.
  2. The number of defined, identified, and prioritized Open Datasets.
  3. The number of identified Open Datasets that have been published.
  4. The number of updates performed on published Open Datasets.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the OD Publishing Process with Pre-defined KPIs. (OD.4.1)

Level 5: Pioneer
Level Description:
- The Entity is conducting continuous improvement to support the defined OD publishing processes to ensure optimization.
- The published Open Datasets and the associated Metadata are regularly reviewed and updated to the newest version.
- The OD and the associated Metadata changes are documented where necessary.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the OD Publication & Maintenance Practices.
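Level 3 above requires datasets to be published in standardized, machine-readable formats with documented metadata. As a purely illustrative sketch of what that looks like in practice (the record fields and values below are assumptions, not an NDMO-prescribed metadata schema), a dataset can be exported as CSV with a JSON metadata record alongside it:

```python
import csv
import io
import json

# Hypothetical open dataset: the rows of records to be published.
rows = [
    {"region": "Riyadh", "year": 2023, "schools": 1500},
    {"region": "Makkah", "year": 2023, "schools": 1200},
]

# Export the data in a machine-readable format (CSV).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["region", "year", "schools"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# Illustrative metadata record documenting the dataset; the keys shown
# here are example assumptions, not an official metadata standard.
metadata = {
    "title": "Schools by Region",
    "classification": "Public",
    "format": "CSV",
    "license": "KSA Open Data License",
    "version": "1.0",
    "last_updated": "2023-10-01",
}
metadata_json = json.dumps(metadata, indent=2)
```

Keeping the metadata in a structured sidecar like this also supports the versioning and maintenance evidence (OD.3.6): a review that updates the dataset bumps `version` and `last_updated` in one place.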
8.1.12. Freedom of Information Domain

Maturity Questions – Freedom of Information Domain

FOI.MQ.1 Has the Entity defined and established a plan to address its compliance with the requirements of the Freedom of Information (FOI) Regulations?

Level 0: Absence of Capabilities
Level Description:
- No plan is in place to address the Freedom of Information (FOI) regulations within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity has existing FOI practices in place to address its compliance with the FOI Regulations, without formalized practices.
Acceptance Evidence:
- The Existing FOI Practices.

Level 2: Defined
Level Description:
- The Entity has a defined and approved FOI plan.
- A roadmap is defined with a list of activities, required resources, training courses & awareness campaigns, and a budget to manage the FOI plan implementation.
Acceptance Evidence:
- The Defined & Approved FOI Implementation Plan & Roadmap. (FOI.1.1)

Level 3: Activated
Level Description:
- The Entity is implementing the defined plan to manage FOI requests.
- The Entity is publishing on its official Government website, with a feedback mechanism for questions or issues raised.
- The Entity appointed an Open Data & Information Access Officer (ODIAO) to manage the Entity's compliance with the FOI regulations, in line with the "Data Management Organization" structure.
- The Entity launched awareness campaigns to promote and enhance the culture of transparency and to raise awareness of the FOI regulations published by NDMO-SDAIA.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The FOI Plan Implementation Status Report.
- The Assigned Open Data & Information Access Officer (ODIAO).
- FOI Awareness. (FOI.2.1)

Level 4: Managed
Level Description:
- To comply with the FOI Regulations, the Entity is monitoring the effectiveness of the established plan with pre-defined Key Performance Indicators (KPIs).
- The Entity is conducting periodic audit and review processes on its compliance with the FOI regulations published by NDMO-SDAIA.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Entity's FOI Plan & Activities with Pre-defined Key Performance Indicators (KPIs).
- The internal audit reports on the Entity's compliance with the FOI Regulations. (FOI.3.6)

Level 5: Pioneer
Level Description:
- The Entity's FOI plans and practices are continuously reviewed and optimized.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the FOI Plan.

Maturity Questions – Freedom of Information Domain

FOI.MQ.2 Has the Entity defined and implemented the required processes for Freedom of Information (FOI)?

Level 0: Absence of Capabilities
Level Description:
- No existing practices are in place for Freedom of Information (FOI) within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity responds to FOI requests on a reactive and ad hoc basis.
Acceptance Evidence:
- The Existing Processes for the FOI Requests & Responses.

Level 2: Defined
Level Description:
- The Entity has defined processes to manage FOI requests (including response processes and appeal / denial processes) in alignment with the National Data Governance Regulations.
- The Entity has developed end-to-end processes, steps, and Frequently Asked Questions (FAQs) for acquiring information from the official Gov portal / website.
Acceptance Evidence:
- The Developed & Approved FOI Request Processes & Procedures Documentation. (FOI.3.1)
- The Developed FOI Process Guide & FAQs.

Level 3: Activated
Level Description:
- The Entity is implementing the defined processes and practices for FOI Regulatory Compliance to ensure adoption across the Entity.
- The Entity is publishing, on the official portal / website, the end-to-end process guide and the FAQs required for acquiring the information.
- An FOI Request Management process is activated across the Entity, including access and appeal / denial practices.
- The Entity has a record-keeping system through which the FOI Register is regularly updated.
- The Entity adopted a pricing scheme for Public Information Access Requests.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The Implementation / Adoption Status Report on the FOI Request Processes. (FOI.3.2)
- Evidence of Entity-wide Communication. (FOI.3.3)
- The Register of the Received Request Forms with the Responses. (FOI.3.4)
- The Identified Public Datasets Shared Under the FOI Regulations.
- Evidence of the Published FOI Communications, Including Guidelines & FAQs, on the Entity's Official Gov Website in Line with the NDMO Requirements. (FOI.3.3)
- The Pricing Scheme for Public Information Access Requests. (FOI.3.5)
- The Up-to-date FOI Register. (FOI.4.1)

Level 4: Managed
Level Description:
- The Entity is monitoring the completeness and performance of the FOI processes with pre-defined KPIs.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined KPIs for the Entity's Responses to FOI Requests.

Level 5: Pioneer
Level Description:
- The Entity continuously reviews & optimizes FOI processes & practices.
- The FOI request response process is automated.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the FOI Processes.
- The Automated Tool for FOI Requests.
8.1.13. Data Classification Domain

Maturity Questions – Data Classification Domain

DC.MQ.1 Has the Entity established a plan for Data Classification (DC) as stipulated by the Data Management and Personal Data Protection (DM & PDP) Standards?

Level 0: Absence of Capabilities
Level Description:
- No plan for Data Classification (DC) is in place.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity has existing DC practices which are not formalized.
Acceptance Evidence:
- The Existing DC Practices.

Level 2: Defined
Level Description:
- The Entity defined a plan for DC to manage and orchestrate its DC activities.
Acceptance Evidence:
- The Defined & Approved Data Classification Plan. (DC.1.1)

Level 3: Activated
Level Description:
- The Entity has implemented the defined DC plan and roadmap on all approved Datasets / artifacts.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The Data Classification Implementation Plan Status Report.

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the DC plan & activities with the pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Implementation Monitoring Report of the DC Plan & Activities with Pre-defined KPIs.

Level 5: Pioneer
Level Description:
- The DC plan & activities are continuously reviewed & updated for optimization & improvement.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Data Classification Plan Review Report.

Maturity Questions – Data Classification Domain

DC.MQ.2 Has the Entity defined, identified, and implemented the required Data Classification (DC) processes?

Level 0: Absence of Capabilities
Level Description:
- No existing processes for Data Classification (DC) are in place within the Entity.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- Existing DC activities are performed on a reactive or ad hoc basis.
Acceptance Evidence:
- The Current List of Classified Datasets.

Level 2: Defined
Level Description:
- The Entity developed a prioritization framework to classify its Datasets.
- The Entity developed a framework for conducting the impact assessment.
- The Entity defined Data Handling and Data Protection Controls for the Datasets and artifacts.
- The Entity defined & identified the DC levels in line with the DM & PDP Standards.
- The Entity defined a process for the identification and inventory of its Datasets and artifacts.
Acceptance Evidence:
- The Data Classification Policy.
- The Data Handling and Protection Controls. (DC.2.1)

Level 3: Activated
Level Description:
- The Entity is identifying and maintaining an inventory of all datasets and artifacts owned by the Entity.
- The Entity is conducting prioritization based on the identified Datasets.
- The Entity is conducting an impact analysis to assess any potential damage of unauthorized access to its identified Datasets, and assigning the defined DC levels in line with the DM & PDP Standards.
- The Entity is conducting an impact analysis for data classified as low impact.
- The Entity is implementing the defined Data Handling and Protection Controls based on the classification level of the Datasets / artifacts.
- The Entity is assigning Data access rights and privileges across the Entity.
- The Entity is maintaining a register of all identified datasets and artifacts.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The Inventory Report of the Identified Datasets and Artifacts. (DC.3.1)
- The Prioritized Datasets and Artifacts. (DC.1.2)
- Evidence of Utilization of the Data Catalog Tool for the Data Inventory.
- The Impact Assessment Report. (DC.3.2)
- The Assessment Report of Low-Impact Data. (DC.3.3)
- The Approved Data Access List of users with the Assigned Privileges.
- The Data Register. (DC.5.1)

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the DC processes with pre-defined KPIs.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DC Processes with Pre-defined KPIs. (DC.4.1)

Level 5: Pioneer
Level Description:
- The Entity is continuously improving the process for DC.
- Innovative solutions and tools are explored to proactively scan and prioritize the classification of Data.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Data Classification Automation Tool.
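The Level 2–3 activities above (an impact assessment framework, then assignment of classification levels and matching handling controls) amount to a rule-based mapping from assessed impact to level. The sketch below is illustrative only: the thresholds, level names, and controls are assumptions for the example, not the levels or controls defined in the DM & PDP Standards.

```python
# Map an assessed impact score to an illustrative classification level.
# Thresholds and level names are hypothetical examples, not the standard.
def classify(impact_score: int) -> str:
    if impact_score >= 9:
        return "Top Secret"
    if impact_score >= 6:
        return "Secret"
    if impact_score >= 3:
        return "Restricted"
    return "Public"

# Illustrative handling / protection controls per level (hypothetical).
HANDLING_CONTROLS = {
    "Top Secret": ["encrypt at rest", "named-individual access"],
    "Secret": ["encrypt at rest", "role-based access"],
    "Restricted": ["role-based access"],
    "Public": [],
}

def controls_for(impact_score: int) -> list[str]:
    # Controls follow from the assigned level, mirroring the requirement
    # to apply handling controls "based on the classification level".
    return HANDLING_CONTROLS[classify(impact_score)]
```

Encoding the mapping once, rather than classifying each dataset by hand, is also what makes the Level 5 goal (tools that proactively scan and prioritize classification) tractable.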

Maturity Questions – Data Classification Domain

DC.MQ.3 Has the Entity reviewed all its classified Datasets and artifacts to ensure that the classification levels assigned to them are the most appropriate ones, as specified by the Data Classification (DC) Policies?

Level 0: Absence of Capabilities
Level Description:
- No existing Data Classification (DC) review processes are in place.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- Existing DC review activities are performed on a reactive or ad hoc basis.
Acceptance Evidence:
- The Current Practices of DC Reviews.

Level 2: Defined
Level Description:
- The Entity defined a review mechanism for all classified Datasets and artifacts to be reviewed.
Acceptance Evidence:
- The DC Review Mechanism.

Level 3: Activated
Level Description:
- The Entity is implementing the defined mechanism for DC reviews of the classified Datasets and artifacts across the Entity.
- The Entity is publishing on the Data Catalog the DC levels for the reviewed Datasets and artifacts as Metadata.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The Data Classification Review Report. (DC.3.4)
- An Evidence Document of the Published Classification Levels as Metadata. (DC.3.5)

Level 4: Managed
Level Description:
- The Entity is monitoring the DC review mechanism with pre-defined KPIs.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DC Review Mechanism with Pre-defined KPIs.

Level 5: Pioneer
Level Description:
- The Entity is continuously improving its DC review process to ensure optimization.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The DC Review Mechanism Review Report.
8.1.14. Personal Data Protection Domain

Maturity Questions – Personal Data Protection Domain

PDP.MQ.1 Has the Entity performed an initial Personal Data Protection (PDP) assessment and developed a plan to address the strategic and operational Privacy requirements?

Level 0: Absence of Capabilities
Level Description:
- No assessment is done for Personal Data Protection (PDP).
- No plan is in place to address Data Privacy requirements.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity is aware of the requirements for Data Privacy Management.
- The Entity conducted an initial PDP Assessment to evaluate the current state of the Entity's PDP practices and identified gaps against business and regulatory obligations, including the following:
  1. Identification of the types of personal data being collected.
  2. Location & method of personal data storage.
  3. Current processing & uses of the personal data.
  4. Privacy challenges to meet compliance with the PDP Regulations published by NDMO-SDAIA.
Acceptance Evidence:
- Evidence of the existing practices of the PDP Domain and Data Privacy.
- The Initial PDP Assessment Result. (PDP.1.1)

Level 2: Defined
Level Description:
- The Entity developed a plan to address the Strategic and Operational Data Privacy requirements, including a list of activities, required resources, training courses, and a budget.
Acceptance Evidence:
- All level 1 acceptance evidence requirements, including:
- The Approved PDP Implementation Plan. (PDP.1.2)
- The PDP Training Plan.

Level 3: Activated
Level Description:
- The Entity is implementing the defined PDP & Data Privacy program in a formalized manner.
- This program includes implementing activities which ensure the proactive identification and resolution of potential Privacy issues and risks, including training and awareness for all employees to promote a Personal Data Protection-centric culture.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The PDP Plan Implementation Status Report.
- Evidence of PDP training activities conducted. (PDP.2.1)

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the PDP plan with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report for the PDP Plan with Pre-defined KPIs.

Level 5: Pioneer
Level Description:
- The Entity periodically reviews its PDP plan to ensure sustained compliance with the applicable regulations and other environmental requirements or impacts.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Review Report and Continuous Improvement of the Personal Data Protection Plan.

Maturity Questions – Personal Data Protection Domain

PDP.MQ.2 Has the Entity defined and implemented Privacy policies and processes for Personal Data, including Data breach identification, consent management, Data subject rights, and Privacy risk assessments?

Level 0: Absence of Capabilities
Level Description:
- No practices are in place to identify and address Data breaches or Privacy violations.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity identifies Data breaches or Privacy violations on a reactive basis, without standardized practices or policies.
- The Entity addresses violations when ad hoc Data Privacy interventions occur.
Acceptance Evidence:
- Evidence of the existing initiatives for PDP and Data Privacy.

Level 2: Defined
Level Description:
- The Entity has developed policies and processes for PDP and Data Privacy, including the identification of Data breaches, Data Privacy considerations, privacy notice & consent management, and compliance management to comply with the applicable regulations.
- Governance has been defined to manage compliance with the PDP regulations, in line with the Data Management Organization structure.
- The Entity has defined a process for Data Subjects' rights management.
Acceptance Evidence:
- The Documented Data Breach Notifications Process.
- The Documented Data Breach Management Process. (PDP.3.2)
- Entity-Specific PDP Policies.
- The PDP & Data Privacy Notice and the Consent Management Process. (PDP.4.1)
- The Data Subjects' Rights Management Processes. (PDP.4.2)

Level 3: Activated
Level Description:
- The Entity is implementing and standardizing the developed and approved practices for PDP, including the Consent Management Workflow, notifications to the regulatory authority as required, and data breach management.
- The PDP policies and standardized processes for data subjects' rights have been published on the Entity's official government website, with a feedback mechanism for questions or issues raised by the data subjects.
- The Entity maintains a register of its compliance records, including records of any collection and / or processing of any Personal Data (e.g. identified Data breaches, etc.). The register is made available to regulators as required.
Acceptance Evidence:
- All level 2 acceptance evidence requirements, including:
- The Developed & Adopted Consent Management Workflow. (PDP.4.1)
- Evidence of Notifications Sent to the Regulatory Authority within the Allotted Timeframe. (PDP.3.1)
- Evidence of Data Breach Management, Including Identified Data Breaches.
- The Results of the PDP Risk Assessments. (PDP.4.3)
- Evidence of Published Data Subjects' Rights Management Processes and Feedback Received from Data Subjects. (PDP.4.2)
- The PDP Register. (PDP.5.1)

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of its PDP practices and its compliance with the rules & regulations through pre-defined KPIs.
- The Entity is conducting periodic audits on compliance with the PDP rules & regulations.
Acceptance Evidence:
- All level 3 acceptance evidence requirements, including:
- The Monitoring Report for the PDP & Data Privacy Practices with Pre-defined KPIs.
- The Compliance Monitoring Report & Audit Results. (PDP.4.4)

Level 5: Pioneer
Level Description:
- The Entity continuously reviews its PDP practices to ensure sustained compliance with regulations.
- Continuous improvement, enabled by automation & change management, is performed in line with the defined regulatory requirements.
Acceptance Evidence:
- All level 4 acceptance evidence requirements, including:
- The Documented Periodic Reviews & Outcomes for the PDP & Data Privacy Practices.
- Evidence of Automation & Change Management for PDP.
8.2. Appendix II – Acceptance Evidence Checklists
The acceptance evidence items outlined in Appendix I are detailed into
acceptance criteria in this section. This streamlines the assessment
journey and ensures coverage of all aspects required for a systematic
evaluation.

8.2.1. Data Governance Domain

Checklist – Data Governance Domain

DG.MQ.1 Has the Entity established & implemented a Data Management & Personal Data Protection (DM & PDP) Strategy and a DM & PDP Plan with Key Performance Indicators (KPIs) that can be continuously measured to ensure optimization?

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: Existing data-related practices.
- Acceptance Criteria: Attach a report showing current data practices.

Level 2: Defined
- Acceptance Evidence: The approved DM & PDP Strategy.
- Acceptance Criteria: Attach the Data Management and Personal Data Protection (DM & PDP) Strategy, which must be approved & include the following, as a minimum:
   Current challenges in DM.
   The Strategic requirements, including:
     Internal requirements emanating from the Entity's business strategy.
     External requirements emanating from the National Strategy for DM & PDP.
   The DM & PDP Vision, Mission, and Strategic Objectives.
   Strategic and operational performance indicators with targets extending from 3 to 5 years over the strategy's implementation duration.
   The financial budget required to implement the strategy, divided according to the identified initiatives.

- Acceptance Evidence: The DM guiding principles.
- Acceptance Criteria: Attach a copy of the DM & PDP program guidelines, which may be a part of the Entity's DM & PDP Strategy document or a separate document. The guidelines must refer to:
   The principles underlying the culture of DM & Data processing, to enable and spread a unified DM & PDP concept within the Entity itself.

- Acceptance Evidence: Data Strategy Approval Decision.
- Acceptance Criteria: Attach a copy of the entity-specific data strategy approval decision by the Entity's Data Management Committee and / or other related senior-level executives within the Entity.

Level 3: Activated
- Acceptance Evidence: The developed DM & PDP implementation plan.
- Acceptance Criteria: Attach the implementation plan, which must include, as a minimum:
   Identified initiatives that cover all the DM domains.
   A prioritized list of DM initiatives with descriptions.
   A three-year implementation roadmap to close the gaps identified between the current state and the target state.

- Acceptance Evidence: Implementation status report.
- Acceptance Criteria: Attach a report on the implementation status including, as a minimum:
   The achievement percentages of the initiatives and projects included in the DM & PDP implementation plan.

Level 4: Managed
- Acceptance Evidence: The monitoring report of the DM & PDP strategy & plan implementation with the pre-defined KPIs.
- Acceptance Criteria: The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) which were pre-defined in the DM & PDP strategic plan, and each indicator's data or card should include the following, as a minimum:
   Indicator's Name / Code.
   Indicator's Owner.
   Indicator's Coordinator.
   Indicator's Description.
   The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
   Indicator's Equation.
   Measurement Unit (Percentage, Number / Quantity, etc.).
   Baseline (measurement value in the first measurement year).
   Target value.
   Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
   Data sources used to calculate the indicator.
   Data collection mechanism.
   Indicator's polarity (+/-) (Positive polarity: a higher indicator value is the target; Negative polarity: a lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: The Continuous Improvement Report of the DM & PDP Strategy.
- Acceptance Criteria: Attach a report showing that the Entity identified, implemented, and has been monitoring the continuous improvement mechanisms for the DM & PDP Strategy. The report shall include the following:
   The updated DM & PDP strategy.
   The updated DG practices, covering the Data Management Organization, the roles, the processes, & the technologies.
   The continuous improvement mechanisms for all DM Domains.

- Acceptance Evidence: The Continuous Improvement Report of the DM & PDP Plan.
- Acceptance Criteria: Attach a report showing that the Entity identified, implemented, and has been monitoring the continuous improvement mechanisms for the DM & PDP Plan. The report shall include the following:
   The documented periodic DM & PDP reviews & results.
   Documented changes to the initially approved plan, where applicable.
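The KPI indicator card described under Level 4 above is a structured record with a fixed set of fields. The sketch below is an illustrative representation only; the field names, the example card, and the on-target check are assumptions for the example, not part of the standard.

```python
from dataclasses import dataclass

@dataclass
class IndicatorCard:
    """Illustrative KPI indicator card; fields follow the checklist above."""
    code: str            # Indicator's Name / Code
    name: str
    owner: str
    description: str
    unit: str            # e.g. "Percentage"
    baseline: float      # measurement value in the first measurement year
    target: float
    periodicity: str     # e.g. "Quarterly"
    polarity: str        # "+" = higher is better, "-" = lower is better

    def on_target(self, measured: float) -> bool:
        # Positive polarity: higher measured values meet the target;
        # negative polarity: lower measured values meet the target.
        if self.polarity == "+":
            return measured >= self.target
        return measured <= self.target
```

The polarity field matters in practice: a card with polarity "-" (for instance, an assumed "average response time" indicator) is on target when the measured value falls at or below the target, the reverse of the usual reading.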

Checklist – Data Governance Domain

Has the Entity established and implemented Data Management (DM) Policies, Standards and Guidelines across all Data
DG.MQ.2
Management (DM) Domains?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

110
Level 1: - The Data Management - The gap analysis document shall include the following as a minimum:
Establishing & Personal Data
Protection (DM & PDP)  An analysis of DM & PDP Standards and Guidelines
Policies, Controls and established by the National Data Management Office (NDMO).
Guidelines Gap
 Identifying & analyzing all Data-related policies & controls
Analysis document.
published by the Oversight Entities & the regulator(s) of the
sector to which the Entity belongs.

 An analysis of the Entity's internal requirements for DM & PDP


Controls.

 Recommendations and suggestions to close the gaps identified


as well a specific list of policies that the Entity will develop in
line with the Policies published by NDMO-SDAIA.

Checklist – Data Governance Domain

Has the Entity established and implemented Data Management (DM) Policies, Standards and Guidelines across all Data
DG.MQ.2
Management (DM) Domains?

Levels Acceptance Evidence Acceptance Criteria

 A plan to develop DM & PDP policies and controls, clearly presenting the implementation timeline.

Level 2: Defined
- Evidence: The developed DM and PDP policies, standards and guidelines covering all DM Domains, as required.
- Criteria: Every policy document shall include the following, as a minimum:
 Policy Name.
 Release date.
 Version number.
 Document control (preparation, review, approval).
 Version history.
 Terminology.
 Goal.
 Scope of work.
 Guiding Principles.
 Policy Statement.
 Job roles and responsibilities.
 Related Policies.
 References.
 Policy Owner.

Level 3: Activated
- Evidence: Implementation status report.
- Criteria: Attach a report showing the implementation status of the developed DM and PDP policies, processes and standards, including, as a minimum:
 Implementation achievement percentages.

- Evidence: A document proving the Entity's approval and adoption of the developed policies, standards and guidelines.
- Criteria: The submitted evidence must be approved or issued by the Entity's authority holder.

- Evidence: Approved Compliance Management Framework.
- Criteria: Attach the Compliance Framework, detailing the following:
 The scope of the periodic compliance audit procedure.
 The processes needed to plan and perform the compliance audit procedure.
 The processes and tools needed for reporting compliance audit results.


 The processes and plans for remediating and escalating non-compliance cases.

Level 4: Managed
- Evidence: The monitoring reports for the developed policies, processes and standards, with pre-defined KPIs.
- Criteria: The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) pre-defined to measure the Entity's performance in monitoring the implementation status of the developed Policies & Standards. Each indicator's data or card should include the following, as a minimum:
 Indicator's Name / Code.
 Indicator's Owner.
 Indicator's Coordinator.
 Indicator's Description.
 The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
 Indicator's Equation.
 Measurement Unit (Percentage, Number / Quantity, etc.).
 Baseline (Measurement value in the first measurement year).
 Target value.
 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
 Data sources used to calculate the indicator.
 Data collection mechanism.
 Indicator's polarity (+/-) (Positive polarity: a higher indicator value is the target; Negative polarity: a lower indicator value is the target).
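The indicator-card fields listed above map naturally onto a small data structure. The sketch below is purely illustrative and is not part of the NDI specification; all class, field, and example values (such as the "DG-POL-01" code) are the editor's own assumptions.

```python
from dataclasses import dataclass


@dataclass
class IndicatorCard:
    """Hypothetical KPI indicator card mirroring the minimum fields above."""
    code: str                 # Indicator's Name / Code
    owner: str                # Indicator's Owner
    coordinator: str          # Indicator's Coordinator
    description: str          # Indicator's Description
    objective: str            # Strategic/operational objective measured
    equation: str             # Indicator's Equation, documented as text
    unit: str                 # Measurement Unit, e.g. "percentage"
    baseline: float           # Measurement value in the first year
    target: float             # Target value
    periodicity: str          # "Monthly" / "Quarterly" / "Biannually" / "Annually"
    data_sources: list        # Data sources used to calculate the indicator
    collection: str           # Data collection mechanism
    polarity: int             # +1: higher is better; -1: lower is better

    def meets_target(self, measured: float) -> bool:
        """Apply the polarity rule: positive polarity wants values at or
        above the target, negative polarity wants values at or below it."""
        if self.polarity > 0:
            return measured >= self.target
        return measured <= self.target


# Example with invented values: a policy-implementation coverage indicator.
card = IndicatorCard(
    code="DG-POL-01", owner="CDO Office", coordinator="DG Officer",
    description="Share of DM policies fully implemented",
    objective="DG.MQ.2 Level 3 implementation status",
    equation="implemented_policies / total_policies * 100",
    unit="percentage", baseline=40.0, target=90.0,
    periodicity="Quarterly",
    data_sources=["Policy register"], collection="Manual review",
    polarity=+1,
)
print(card.meets_target(92.5))  # True: 92.5% meets the 90% target
```

The polarity field is what lets one monitoring report evaluate both "higher is better" indicators (e.g. training coverage) and "lower is better" ones (e.g. issue-resolution duration) with a single rule.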

- Evidence: The Entity's Compliance Audit Results Report.
- Criteria: With the compliance report, attach a report documenting the results and outputs of each procedure implemented to monitor compliance. The compliance report shall include the following, as a minimum:
 Compliance or non-compliance with each specification.
 An explanation of the compliance results, with sufficient evidence for each specification.
 Recommendations to remediate each instance of non-compliance.
 The accountable stakeholder for each recommendation, and the target date to complete the recommendation.

- Evidence: Compliance Monitoring Report.
- Criteria: Attach a report(s) confirming that the Entity has established and monitored compliance points by implementing a periodic compliance audit procedure. For the report to be accepted, it must be:
 Compliant with the national framework for compliance in data management and governance.


 Include compliance audit scores generated by the periodic compliance audits.

Level 5: Pioneer
- Evidence: Continuous Improvement Report of the updated DM and PDP policies, standards, and updated DG processes.
- Criteria: Attach a report showing that the Entity identified, implemented and is monitoring continuous improvement mechanisms, which must include:
 A report containing samples of the updated policies, standards and processes related to DM, DG & PDP.

Checklist – Data Governance Domain

DG.MQ.3: Has the Entity established and operationalized all roles required for the Data Management Organization as per the NDMO Controls & Specifications?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Evidence: Not Applicable.
- Criteria: Not Applicable.
Level 1: Establishing
- Evidence: The Entity's Data Management Office establishment decision.
- Criteria: Attach the decision to establish a Data Management Office to supervise the implementation of the national strategy for DM, DG & PDP. To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Office's responsibilities are documented in line with the "Organizational Manual".

- Evidence: The appointment/hiring decision for the following role: A. Chief Data Officer (CDO).
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The CDO's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".


- Evidence: The appointment/hiring decision for the following role: B. Data Management Officer / Data Governance Officer.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Data Management Officer / Data Governance Officer's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

Level 2: Defined
- Evidence: Entity Data Management & Data Governance Committee formation decision.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Committee's Charter, including:
 Roles and responsibilities.
 Rules of work for the committee.

- Evidence: The hiring/appointment decision for the following role: A. Compliance Officer.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Compliance Officer's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: The hiring/appointment decision for the following role: B. Business Data Executive.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Business Data Executive's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: The hiring/appointment decision for the following role: C. Legal Advisor.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Legal Advisor's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".


Level 3: Activated
- Evidence: The hiring/appointment decision for the following role: A. Business Data Steward(s).
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The Business Data Steward's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: B. List of the IT Data Stewards.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The IT Data Steward's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: The hiring/appointment decision for the following role: C. Personal Data Protection (PDP) Officer.
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The PDP Officer's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: The hiring/appointment decision for the following role: D. Open Data and Information Access Officer (ODIAO).
- Criteria: To accept the attachment, the following requirements must be met:
 The decision must be issued by the Entity's authority holder.
 The ODIAO's responsibilities are clarified in an approved job description.
 The responsibilities are in line with the "Organizational Manual".

- Evidence: The documented and approved Data Management Organization structure.
- Criteria: Attach the documented and approved Data Management Organization structure, including the following, as a minimum:
 Roles and responsibilities.
 Approved job descriptions for each role.
 Authorization: reviews, approvals and decision making.


- Evidence: The documented Data Stewardship / Data Ownership structure.
- Criteria: Attach the documented and approved Data Stewardship / Data Ownership structure, including the following, as a minimum:
 Identified Data Domains.
 Business Data Executives for each Data Domain.
 Business Data Steward for each Data Domain.
 IT Data Steward for each Data Domain.
 Roles and responsibilities.

Level 4: Managed
- Evidence: The monitoring reports for the Entity's Data Management Organization roles, with pre-defined KPIs.
- Criteria: The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) pre-defined for the Data Management Organization roles. Each indicator's data or card should include the following, as a minimum:
 Indicator's Name / Code.
 Indicator's Owner.
 Indicator's Coordinator.
 Indicator's Description.
 The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
 Indicator's Equation.
 Measurement Unit (Percentage, Number / Quantity, etc.).
 Baseline (Measurement value in the first measurement year).
 Target value.
 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
 Data sources used to calculate the indicator.
 Data collection mechanism.
 Indicator's polarity (+/-) (Positive polarity: a higher indicator value is the target; Negative polarity: a lower indicator value is the target).

Level 5: Pioneer
- Evidence: The Continuous Improvement Report for the DM Organization and Data Stewardship.
- Criteria: Attach a report showing that the Entity identified, implemented and is monitoring continuous improvement mechanisms within the Entity for:
 The Data Management Organization.
 The Data Stewardship / Ownership structure.

Checklist – Data Governance Domain

DG.MQ.4: Has the Entity established and implemented practices for Change Management, including awareness, communication, change control, and capability development?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Evidence: Not Applicable.
- Criteria: Not Applicable.

Level 1: Establishing
- Evidence: Evidence of communication on DM-related practices.
- Criteria: Attach evidence showing that the Entity has communications regarding Data Management practices, such as:
 E-mails.
 Correspondence.

- Evidence: Evidence of training and awareness sessions conducted.
- Criteria: Attach a report on the training courses and awareness programs implemented by the Entity, including the following, as a minimum:
 Names of the implemented training courses related to DM, DG & PDP.
 Awareness programs implemented in the different Domains under DM, DG & PDP.
 A sample certificate of attendance for training the Entity's employees in Domains related to DM, DG & PDP.

Level 2: Defined
- Evidence: The Change Management Plan.
- Criteria: The Change Management Plan covering all DM Domains, including:
 The DM & PDP Training Plan for all DM Domains.
 The DM Communication Plan.
 The Stakeholder Engagement Plan.
 The Change Control Plan for Data system changes.

Level 3: Activated
- Evidence: The Change Management Implementation Status Report showing the DM & PDP Training activities.
- Criteria: Attach a report on the implementation status of the Change Management Plan. The report must include, as a minimum:
 The DM & PDP Training implementation status for all DM Domains, showing:
 Evidence of training conducted for the Entity's employees, covering all DM Domains.
 A list of activities carried out by the Entity to raise awareness regarding the national regulations, laws, policies, controls and standards, and their applicability (such as e-mails, publications, lectures, workshops, etc.).
 A list of activities carried out by the Entity to raise awareness regarding the national DM, DG & PDP strategy and programs and their applicability to the Entity (such as e-mails, publications, lectures, workshops, etc.).


 A list of activities carried out by the Entity to raise awareness regarding the Data Management Domains as per the National Data Management and Personal Data Protection Framework, addressed to the related Data Management and Personal Data Protection roles.

- Evidence: The Change Management Implementation Status Report showing the DM Communication activities.
- Criteria: Attach a report on the implementation status of the Change Management Plan. The report must include evidence confirming the Entity's continuous communication regarding the following:
 The DM, DG & PDP Program, activities and main decisions.
 The storage of DM & DG documents and artifacts.
 The measurement of DM & DG performance indicators.
 The updates on DM & DG policies and processes.
 The updates on compliance reports, implementation plans, and the regulatory (legislative) environment related to DM & DG.

- Evidence: The Stakeholder Engagement and Socialization Plan implementation status report.
- Criteria: Attach a report showing that the Entity is engaging the identified stakeholders to develop and improve the capabilities of the DM & PDP program.

- Evidence: The Data Governance Approvals Register.
- Criteria: The register should include, as a minimum:
 The CDO's DG decisions with their rationalized constituents.
 DM & DG Committee decisions.

- Evidence: The Data Management Issue Tracking Register.
- Criteria: The register should include, as a minimum:
 A sample of DM & DG issues reported by business and technical users.
 Evidence of resolution of the reported issues.

- Evidence: Evidence that the Entity has DM & DG document and artifact version control practices.
- Criteria: Attach a sample of a register / record or a document created by the Entity, which must include the following, as a minimum:
 The Domain name of the registry.
 The registry's issuance date.
 The registry's updating dates.
 The registry's version control.
 The updates made to the registry or document.
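A register entry carrying the fields above is, in effect, a small version-controlled record. The following is a minimal illustrative sketch; the class and field names are the editor's assumptions, not an NDI-mandated format.

```python
from dataclasses import dataclass, field


@dataclass
class RegisterEntry:
    """Hypothetical DM/DG document register record with version control."""
    domain: str                                   # Domain name of the registry
    issued: str                                   # issuance date, ISO format
    versions: list = field(default_factory=list)  # (version, date, summary)

    def record_update(self, version: str, date: str, summary: str) -> None:
        """Append an update so every change to the registry stays traceable."""
        self.versions.append((version, date, summary))

    @property
    def current_version(self) -> str:
        """Latest recorded version, or the initial issue if never updated."""
        return self.versions[-1][0] if self.versions else "1.0"


# Example with invented dates and version numbers.
entry = RegisterEntry(domain="Data Governance", issued="2023-10-01")
entry.record_update("1.1", "2024-01-15", "Added PDP policy references")
print(entry.current_version)  # prints "1.1"
```

Keeping the update history as an append-only list is what makes the "updating dates" and "updates made" evidence reproducible on request.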


Level 4: Managed
- Evidence: The monitoring report of the Change Management practices, with pre-defined KPIs.
- Criteria: Attach a report on monitoring the Change Management activities and practices, based on pre-defined KPIs that include, as a minimum, the following indicators:
 The indicator of the periodic meetings of the internal DM, DG & PDP committee.
 The indicator of the completed training and awareness sessions.
 The indicator of the attendance rates in the completed training and awareness sessions.
 The duration indicator of the DM & PDP issue resolutions.
 The quantity indicator of the number of resolved and closed change requests.

- Each indicator's data or card must include the following, as a minimum:
 Indicator's Name / Code.
 Indicator's Owner.
 Indicator's Coordinator.
 Indicator's Description.
 The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
 Indicator's Equation.
 Measurement Unit (Percentage, Number / Quantity, etc.).
 Baseline (Measurement value in the first measurement year).
 Target value.
 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
 Data sources used to calculate the indicator.
 Data collection mechanism.
 Indicator's polarity (+/-) (Positive polarity: a higher indicator value is the target; Negative polarity: a lower indicator value is the target).

Level 5: Pioneer
- Evidence: The Continuous Improvement Report of the Change Management practices for all DM Domains.
- Criteria: Attach a report showing that the Entity identified, implemented and is monitoring mechanisms for continuous improvement of Change Management and of spreading awareness about all Domains under DM, DG & PDP.

8.2.2. Metadata and Data Catalog Domain

Checklist – Metadata and Data Catalog Domain

MCM.MQ.1: Has the Entity developed and implemented a plan to integrate and manage Metadata across the Entity?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Evidence: Not Applicable.
- Criteria: Not Applicable.

Level 1: Establishing
- Evidence: The Recorded or Documented Metadata Report.
- Criteria: The Entity must attach a report on the Metadata recorded or documented as part of any project's implementation, or in standalone applications.

Level 2: Defined
- Evidence: The Approved Metadata and Data Catalog Plan.
- Criteria: The Entity must attach the approved MCM tool / solution implementation plan, including the following, as a minimum:
 A roadmap that includes the projects and milestones of the technological tool / solution implementation for the Data Catalog. The activities shall incorporate what is needed to achieve this Domain's specifications, as a minimum.
 The assignment of the required resources and budget allocation to manage the implementation of the Data Catalog automation tool.

- Evidence: The Approved Metadata Structure and Framework.
- Criteria: The Entity must attach an updated and approved report illustrating the approved Metadata architecture and Metadata framework. The report must include the following, as a minimum:
 The business Metadata fields to be filled in the Data Catalog.
 The additional fields based on the Entity's requirements.

Level 3: Activated
- Evidence: The Metadata and Data Catalog Plan Implementation Status Report.
- Criteria: The Entity must attach an updated and approved report clarifying the MCM Plan implementation status.

- Evidence: The Metadata Management Framework Implementation Status Report.
- Criteria: The Entity must attach an updated and approved report clarifying the Metadata Management Framework implementation status.


Level 4: Managed
- Evidence: The Monitoring Report with pre-defined Key Performance Indicators (KPIs) for the Metadata and Data Catalog Plan and activities.
- Criteria: The Entity must attach an updated and approved monitoring report on the MCM Plan and activities, prepared based on pre-defined KPIs (Indicator Cards). For example, the indicators may include measuring the achievement percentages of the plan implementation requirements.

- Each indicator's data or card must include the following, as a minimum:
 Indicator's Name / Code.
 Indicator's Owner.
 Indicator's Coordinator.
 Indicator's Description.
 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
 Indicator's Equation.
 Measurement Unit (Percentage, Number / Quantity, etc.).
 Baseline (Measurement value in the first measurement year).
 Target value.
 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
 Data sources used to calculate the indicator.
 Data collection mechanism.
 Indicator's polarity (+/-) (Positive polarity: a higher indicator value is the target; Negative polarity: a lower indicator value is the target).

Level 5: Pioneer
- Evidence: The Continuous Improvement Report on the Metadata and Data Catalog Plan.
- Criteria: The Entity must attach an updated and approved report including the following, as a minimum:
 The documents of the periodic reviews and the documented results of the MCM Plan.
 The continuous improvement mechanisms of the MCM Plan.

Checklist – Metadata and Data Catalog Domain

MCM.MQ.2: Has the Entity implemented a Metadata Management and Data Catalog tool / solution?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Evidence: Not Applicable.
- Criteria: Not Applicable.

Level 1: Establishing
- Evidence: Evidence of Existing Metadata.
- Criteria: The Entity must attach a report proving the existence of the identified Metadata within standalone applications or for a specific project.

Level 2: Defined
- Evidence: The Selected Data Catalog Tool.
- Criteria: The Entity must attach an approved report clarifying the technology / tool chosen for the Data Catalog (including the vendor, version, and specifications).

- Evidence: The Approved and Prioritized Data Sources Report.
- Criteria: The Entity must attach an approved report that includes the following, as a minimum:
 The Data Catalog sources, where each source has an approved priority.
 The definitions of the Entity's business Metadata and technical Metadata.

- Evidence: The Developed and Approved Target Metadata Architecture.
- Criteria: The Entity must attach an approved report clarifying the developed and approved target Metadata architecture, including (but not limited to) the following:
 Metadata Sources: the Entity's Data Sources that are sources of the Metadata used in the Data Catalog.
 Metadata Repository: the Data Catalog as the Entity's central Metadata Repository.
 Metadata Flows: a definition of the Metadata flow between the Metadata Sources and the Metadata Repository.
 Metadata Model: the Metadata Model used by the Entity's Data Catalog.
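The four architecture elements above (sources, repository, flows, model) can be sketched in code. Everything below, from the class names to the one-level key scheme, is an illustrative assumption by the editor, not an NDMO-specified structure.

```python
from dataclasses import dataclass, field


@dataclass
class MetadataSource:
    """A system whose technical metadata is harvested into the Data Catalog."""
    name: str
    technical_metadata: dict  # e.g. asset name -> list of column names


@dataclass
class MetadataRepository:
    """The Data Catalog acting as the central Metadata Repository."""
    entries: dict = field(default_factory=dict)


def metadata_flow(source: MetadataSource, repo: MetadataRepository) -> None:
    """One metadata flow: copy the source's technical metadata into the
    repository, keyed by source and asset name (a simple metadata model)."""
    for asset, columns in source.technical_metadata.items():
        repo.entries[f"{source.name}.{asset}"] = {
            "columns": columns,
            "source": source.name,
        }


# Example: a single (invented) ERP database feeding the catalog.
erp = MetadataSource("erp_db", {"invoices": ["id", "amount", "issued_at"]})
catalog = MetadataRepository()
metadata_flow(erp, catalog)
print(sorted(catalog.entries))  # prints ['erp_db.invoices']
```

In a real target architecture each flow would also carry business metadata and run on a schedule; the point of the sketch is only the separation of source, flow, and repository that the report must document.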


- Evidence: The Data Catalog Tool Implementation Requirements Report.
- Criteria: The Entity must attach an approved report clarifying the Data Catalog tool implementation requirements, including the following, as a minimum:
 The procurement approval for a tool to automate the Data Catalog.
 The approved plan to implement the Data Catalog automation tool.

- Evidence: The Developed and Approved Data Catalog Training Plan.
- Criteria: The Entity must attach the developed and approved training and awareness plan regarding Data Catalog usage, including the following, as a minimum:
 The training and awareness plan's approval and validity / expiry dates.
 The training and awareness programs plan implementation dates.
 The scope of the training and awareness.
 The objectives of the training and awareness.
 The training shall include the following:
 An overview of the Data Catalog concept and benefits.
 Introductory and advanced lessons on the automated Data Catalog tool and its functionalities.
 Practical (hands-on) exercises based on use cases on the automated Data Catalog tool.
 The training and awareness target audiences.
 The topics of the training programs and awareness campaigns.
 The methods and channels through which the training plan will be conducted.
 The identified MCM Domain awareness campaign channels, which include:
 E-mails or mobile phone messages.
 Publications.
 Lectures or workshops.


Level 3: Activated
- Evidence: Evidence of the Implemented Data Catalog Tool.
- Criteria: The Entity must attach a report on the Data Catalog tool proving that the tool is implemented in the Entity.

- Evidence: Data Access Approval Process Documentation (authorization to connect the Data Catalog with the Data Sources).
- Criteria: The Entity must attach an approved report about the documentation of the approval process for granting authorized access to the Entity's Data by connecting the Data Catalog tool to the Entity's Data Sources.

- Evidence: Metadata Access Approval Process Documentation.
- Criteria: The Entity must attach an approved report about the documentation of the approval process for granting scope-based access to the Metadata, including the following, as a minimum:
 The name of the system / tool and its version number.
 A description of the access granting and approval process.
 A sample of role-based access groups.
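Scope-based metadata access of this kind is commonly modeled with role-based access groups. A minimal illustrative check follows; the group names, role names, and actions are invented for the example and are not drawn from the NDI.

```python
# Hypothetical role-based access groups mapping a role to its allowed actions.
ACCESS_GROUPS = {
    "catalog_admin": {"read", "write", "approve"},
    "data_steward": {"read", "write"},
    "business_user": {"read"},
}


def can_access(role: str, action: str) -> bool:
    """Grant the action only if the role's group includes it.
    Unknown roles get no access at all."""
    return action in ACCESS_GROUPS.get(role, set())


print(can_access("business_user", "read"))   # True
print(can_access("business_user", "write"))  # False
```

Documenting the approval process then amounts to showing who may add a user to a group and which workflow records that decision.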

- Evidence: Evidence of Data Catalog adoption and usage, including Metadata populated in the tool.
- Criteria: The Entity must attach a report proving the adoption and usage of the Data Catalog, with evidence of populating Metadata in the tool, including the following, as a minimum:
 The identified Data Catalog power users: the Entity's advanced Data Catalog users who can act as coaches for other users.
 The communication plan between the current Data Catalog power users and the other Data Catalog users to activate Data Catalog usage. The plan must include (but is not limited to) the following:
 A description of the communication procedure actions and processes.
 The required frequency of the communication processes.
 The target audience.

- Evidence: The Regular Audits Report on the Data Catalog Usage.
- Criteria: The Entity must attach the periodic audits report on the Data Catalog usage, including the following, as a minimum:
 An audit sample of information about the Data Catalog users.
 An audit sample of processes / operations performed by the users.


 An audit sample of the Data Catalog activity memory and the separately maintained logs of the tracking functionality / audit trail.

- Evidence: Evidence of training conducted for the identified Data Catalog users.
- Criteria: The Entity must attach a report proving the implementation of the training and awareness programs conducted for the identified Data Catalog users. The report must include the following, as a minimum:
 The list of conducted training programs and awareness campaigns (including the topics of the programs, the dates on which they were conducted, and the names of the training attendees).
 Samples of the activities performed by the Entity to raise awareness about the MCM Domain, including:
 E-mails or mobile phone messages.
 Publications.
 Lectures or workshops.
 A sample attendance certificate of training the Entity's employees.

- Evidence: Tool Versioning Report.
- Criteria: The Entity must attach a tool versioning report including the following, as a minimum:
 The name of the developer of the current tool used by the Entity.
 The number of the current version used by the Entity.
 The current version's release date within the Entity.
 The number and date of the latest tool version published by the developer.
 The methodology of the version management strategy (releasing versions of technological solutions and tools for the Data Catalog) followed in cases where issues prevent upgrading the current version to the latest version.

Level 4: Managed
- Evidence: The Monitoring Report with pre-defined KPIs for the adoption and usage of the Metadata & Data Catalog solution / tool.
- Criteria: The Entity must attach a monitoring report on the Metadata and on the Data Catalog usage based on pre-defined KPIs (Indicator Cards), which cover:
 The number of registered Data Catalog users.
 The number of active Data Catalog users.
 The number of logins to the Data Catalog.
 The number of performed Metadata queries.
 The number of annotations (tags, comments) added to the Data assets.
 The number of ratings added to Data assets.
 The number of trust certificates assigned to the Metadata.

- Each indicator’s data or card must include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (with


identifying the Specification or the Process to which the
indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually /


Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

128
 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator
value is the target, Negative Polarity: Lower indicator value
is the target).
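Usage indicators of this kind are typically derived from the catalog's own audit trail. The sketch below is illustrative only: the (user, action) log layout, the action names, and the sample users are all assumptions, not a format prescribed by the NDI or by any particular catalog tool.

```python
from collections import Counter

# Illustrative audit-trail records: (user, action) events from the catalog.
audit_log = [
    ("amal", "login"), ("amal", "metadata_query"),
    ("fahad", "login"), ("fahad", "annotation"),
    ("amal", "login"), ("nora", "metadata_query"),
]


def usage_kpis(log):
    """Aggregate several of the Level 4 usage indicators from raw events."""
    actions = Counter(action for _, action in log)
    return {
        "active_users": len({user for user, _ in log}),
        "logins": actions["login"],
        "metadata_queries": actions["metadata_query"],
        "annotations": actions["annotation"],
    }


print(usage_kpis(audit_log))
# prints {'active_users': 3, 'logins': 3, 'metadata_queries': 2, 'annotations': 1}
```

Computing the indicators directly from the audit trail also satisfies the earlier "regular audits" evidence: the same log backs both the KPI report and the audit samples.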

- Evidence: The approved list of pre-defined KPIs for the quality of the Metadata populated in the Data Catalog tool.
- Criteria: The Entity must attach a list of approved KPIs (Indicator Cards) pre-defined to monitor the Metadata quality, including the following, as a minimum:
 Completeness.
 Accuracy.
 Consistency.

- Each indicator's data or card must include the following, as a minimum:
 Indicator's Name / Code.
 Indicator's Owner.


 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (with


identifying the Specification or the Process to which the
indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually /


Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator


value is the target, Negative Polarity: Lower indicator value

129
is the target).
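Of the three quality KPIs, completeness has the most mechanical equation: the share of required metadata fields actually populated across catalog entries. A minimal sketch follows; the required field names and sample entries are the editor's assumptions, chosen only to make the ratio concrete.

```python
# Mandatory business-metadata fields assumed for this example.
REQUIRED_FIELDS = ("name", "description", "owner")


def completeness(entries) -> float:
    """Share of required metadata fields that are populated (non-empty),
    across all catalog entries; returns a value between 0.0 and 1.0."""
    total = len(entries) * len(REQUIRED_FIELDS)
    filled = sum(1 for e in entries for f in REQUIRED_FIELDS if e.get(f))
    return filled / total if total else 1.0


entries = [
    {"name": "invoices", "description": "ERP invoices", "owner": "Finance"},
    {"name": "customers", "description": "", "owner": None},
]
print(completeness(entries))  # 4 of 6 required fields are filled
```

Accuracy and consistency follow the same ratio pattern but need a reference to compare against (the source system's metadata for accuracy, sibling entries for consistency), which is why those indicator cards must name their data sources explicitly.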

Level 5: Pioneer

- The Continuous Improvement Report on the Metadata and Data Catalog Tool's Quality: The Entity must attach an updated & approved report presenting continuous improvement on the MCM tool, including the following, as a minimum:

 • The documents of the periodic reviews & documented results.

 • The continuous improvement mechanisms.

 • The resources assigned for implementing the continuous improvement plan.

- The Metadata Management Automation Report: The Entity must attach an updated and approved report proving the full automation of end-to-end Metadata Management using a fully implemented Data Catalog tool, e.g., automating Metadata collection and exchange (evidence such as reports, screenshots, etc.).
Checklist – Metadata and Data Catalog Domain

MCM.MQ.3: Has the Entity defined and implemented formal processes for effective Metadata Management (such as prioritization, population, access management, and quality issue management), supported & fostered by collaboration across the Entity?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of the Existing Processes Used to Manage Metadata: The Entity must attach a document of the current processes followed for Metadata Management (evidence such as reports, screenshots, etc.).

Level 2: Defined

- Metadata Identification Process Report: The Entity must attach a report showing the Entity’s process for identifying and defining the business and technical Metadata that will be included in the Data Catalog.

- Metadata Prioritization Process Report: The Entity must attach a report showing the Entity’s process for prioritizing the identified Metadata.

- Metadata Population Process Report: The Entity must attach a report showing the process for registering and populating the Metadata within the Data Catalog, implemented as a workflow within the automated Data Catalog tool.

- Metadata Update Process Report: The Entity must attach a report showing the process for updating Metadata in the Data Catalog, implemented as a workflow within the automated Data Catalog tool.

- Metadata Quality Process Report: The Entity must attach a report showing the process for identifying and resolving quality issues with the Metadata. This process must include a mechanism for reporting identified quality issues and developing remediation actions within defined SLAs, implemented as a workflow within the automated Data Catalog tool.

- Metadata Annotation Process Report: The Entity must attach a report showing the process for regularly reviewing the Metadata annotations (tags, comments) added by users within the Data Catalog, implemented as a workflow within the automated Data Catalog tool.

- Metadata Certification Process Report: The Entity must attach a report showing the process for regularly reviewing the trust certificates assigned by users to the Metadata within the Data Catalog, implemented as a workflow within the automated Data Catalog tool.


Level 3: Activated

- Evidence of the Implementation and Adoption of the Approved Processes as Workflows in the Entity's Data Catalog: The Entity must attach an approved report proving the implementation and adoption of the approved processes, including each process’s workflow chart. The processes must include the following, as a minimum:

 • Metadata Identification Process.

 • Metadata Prioritization Process.

 • Metadata Population Process.

 • Metadata Update Process.

 • Metadata Quality Process.

 • Metadata Annotation Process.

 • Metadata Certification Process.

- The Logs or the List of Notifications on the Metadata Changes: The Entity must attach a report that includes the logs, or a list of alerts triggered by Metadata changes.

- Evidence of Communications to the Data Catalog Users on any Metadata Update: The Entity must attach an implementation report of notifying the Data Catalog users (e.g., by email) when any Metadata is updated.

- The Metadata Stewardship Coverage Model: The Entity must attach a report clarifying the Metadata Stewardship Coverage Model.

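The change-log and user-notification evidence items above describe a common pattern: every Metadata update is appended to an audit trail and pushed to subscribed users. The following is a minimal illustrative sketch (the `MetadataCatalog` class and its fields are hypothetical, not part of any particular Data Catalog product):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MetadataCatalog:
    """Minimal illustrative catalog: logs every Metadata change and
    notifies subscribers, mirroring the Level 3 evidence items."""
    entries: dict = field(default_factory=dict)      # term -> definition
    change_log: list = field(default_factory=list)   # audit trail of change events
    subscribers: list = field(default_factory=list)  # callables, e.g. an email sender

    def update(self, term: str, definition: str, changed_by: str) -> None:
        old = self.entries.get(term)
        self.entries[term] = definition
        event = {
            "term": term,
            "old": old,
            "new": definition,
            "changed_by": changed_by,
            "at": datetime.now(timezone.utc).isoformat(),
        }
        self.change_log.append(event)       # the "logs / list of alerts"
        for notify in self.subscribers:     # the "communications to users"
            notify(event)

# Usage: record notifications in a list instead of sending real emails.
sent = []
catalog = MetadataCatalog(subscribers=[sent.append])
catalog.update("customer_id", "Unique customer identifier",
               changed_by="steward@entity.example")
```

In a production tool the subscriber callable would be an email or workflow trigger; the separation between the log and the notification channel is the point being illustrated.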
Level 4: Managed

- The Metadata Processes Monitoring Report with Pre-defined KPIs: The Entity must attach an updated and approved report on monitoring the Metadata processes based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

- The Metadata Quality Monitoring Report with Pre-defined KPIs: The Entity must attach an updated and approved report on monitoring the Metadata Quality based on pre-defined KPIs (Indicator Cards), including the following, as a minimum:

 • Completeness.

 • Accuracy.

 • Consistency.

- Each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the Metadata Management Practices: The Entity must attach an updated & approved report including the following, as a minimum:

 • The documents of the periodic reviews & documented results of the Metadata Management practices.

 • The continuous improvement mechanisms of the Metadata Management practices.
8.2.3. Data Quality Domain

Checklist – Data Quality Domain

DQ.MQ.1 Has the Entity developed and implemented a Data Quality (DQ) plan focused on improving the quality of the Entity's Data?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of the Existing Data Quality (DQ) Related Activities: The Entity must attach a report on the activities practiced in the DQ Domain. For each activity, the report must contain the following, as a minimum:

 • Copies / screenshots of e-mail correspondence which supports practicing the activity.

 • Copies / screenshots of documents which support practicing the activity.

 • Screenshots of the systems which support practicing the activity.

Level 2: Defined

- The Defined and Approved DQ Implementation Plan: The Entity must attach a defined and approved DQ management plan to implement and manage the activities aimed at improving the quality of the Entity’s Data. This plan must contain the following, as a minimum:

 • A roadmap that includes the activities and milestones of implementing the DQ Management practices in the Entity. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 • The assignment of the required resources & budget allocation to manage the implementation of the DQ Management Plan.

Level 3: Activated

- A Report on the DQ Plan Implementation Status: The Entity must attach an updated & approved report on the DQ plan implementation status containing the following, as a minimum:

 • The DQ Domain activities and the implementation status of every activity.

- A Report on the Defined DQ Roles & Responsibilities: The Entity must attach an updated & approved report presenting the implementation status of the DQ Domain’s defined roles and responsibilities containing the following, as a minimum:

 • The DQ roles of the Data Stewards.

 • The responsibilities of the Data Stewards in the DQ Domain operations.

- A Report on the Assigned Resources for the DQ Plan: The Entity must attach an updated & approved report presenting the current status of the resources allocated for the implementation of the DQ activities containing the following, as a minimum:

 • The people, processes and tools needed to implement the DQ activities.


Level 4: Managed

- The Monitoring Report of the DQ Plan and Activities with Pre-defined Key Performance Indicators (KPIs): The Entity must attach an updated & approved monitoring report on the DQ activities implementation plan based on the KPIs (Indicator Cards) pre-defined in the DQ implementation plan. Each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the Data Quality Plan & Activities: The Entity must attach an updated & approved report presenting continuous improvement in the DQ Plan & Activities containing the following, as a minimum:

 • The review document & the review results.

 • The DQ continuous improvement plan.
Checklist – Data Quality Domain

DQ.MQ.2 Has the Entity established / developed and implemented practices to manage and improve the quality of the Entity's Data?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing DQ Domain Initiatives: The Entity must attach a report on the current DQ Domain initiatives.

- The Existing Processes for Detecting DQ Issues: The Entity must attach a report on the current DQ issue detection processes.

- The Existing Processes Used for Data Corrections or Data Validations: The Entity must attach a report on the current processes used for Data corrections or Data validations.

Level 2: Defined

- The Prioritized List of Data Elements: The Entity must attach a report listing the Data elements ranked / prioritized based on business requirements. The first-priority Data must include the Entity’s Master Data, as a minimum.

- The Defined DQ Dimensions for the Entity's Datasets: The Entity must attach a report clarifying the DQ Dimensions defined for the Entity's Datasets, including the following, as a minimum:

 • Completeness (the degree to which data records are complete).

 • Uniqueness (the degree to which data records are unique, without duplicates).

 • Timeliness (the degree to which data is up to date and available when it is needed).

 • Validity (the degree of records' conformance to the established formats, types and ranges).

 • Accuracy (the degree to which data values align with real-world values).

 • Consistency (the degree to which data is consistent across the Entity’s business and across different sources).

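Several of the dimensions above lend themselves to simple record-level checks. The sketch below is illustrative only (the field names, sample data and scoring are hypothetical, not mandated by the NDI); it shows how completeness and uniqueness could be scored for a small dataset:

```python
def completeness(records, fields):
    """Share of required field values that are populated (non-null, non-empty)."""
    total = len(records) * len(fields)
    filled = sum(
        1 for r in records for f in fields
        if r.get(f) not in (None, "")
    )
    return filled / total if total else 1.0

def uniqueness(records, key_fields):
    """Share of records whose key is distinct (1.0 means no duplicates)."""
    keys = [tuple(r.get(f) for f in key_fields) for r in records]
    return len(set(keys)) / len(keys) if keys else 1.0

# Hypothetical register sample with one empty value and one duplicate key.
sample = [
    {"national_id": "1001", "name": "A"},
    {"national_id": "1002", "name": ""},    # incomplete: empty name
    {"national_id": "1001", "name": "A"},   # duplicate key
]
print(completeness(sample, ["national_id", "name"]))  # 5 of 6 values filled
print(uniqueness(sample, ["national_id"]))            # 2 distinct of 3 keys
```

Timeliness, validity, accuracy and consistency would need reference data (expected refresh dates, format rules, authoritative sources), so they are omitted from this sketch.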

- The Data Quality Rules Report: The Entity must attach a report clarifying the list of defined DQ Rules aligned with the DQ Dimensions, including the following, as a minimum:

 • The DQ rule owner.

 • The business description of the requirement to be validated by the rule.

 • For the particular data whose quality is being measured: the assignment of rules to each of the DQ Dimensions, including the following, as a minimum:

    • Completeness.

    • Uniqueness.

    • Timeliness.

    • Validity.

    • Accuracy.

    • Consistency.

 • The list of Data Attributes validated by the defined rules.

 • The metrics that are calculated when validating each DQ rule.

 • The escalation threshold that triggers a DQ alert for the rule.

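A rule record of the kind described above ties an owner and a dimension to a metric and an escalation threshold. The following is an illustrative sketch (the `DQRule` structure, field names and the 95% threshold are hypothetical examples, not NDI requirements):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    """Illustrative DQ rule card: an owner and dimension bound to a
    metric function and the escalation threshold that triggers an alert."""
    rule_id: str
    owner: str
    dimension: str                     # e.g. "Completeness"
    description: str
    metric: Callable[[list], float]    # returns a score in [0, 1]
    threshold: float                   # scores below this trigger a DQ alert

    def evaluate(self, records: list) -> dict:
        score = self.metric(records)
        return {"rule": self.rule_id, "score": score, "alert": score < self.threshold}

# Hypothetical rule: at least 95% of records must carry a birth_date value.
rule = DQRule(
    rule_id="DQ-001",
    owner="Business Data Steward",
    dimension="Completeness",
    description="birth_date must be populated",
    metric=lambda recs: sum(1 for r in recs if r.get("birth_date")) / len(recs),
    threshold=0.95,
)
result = rule.evaluate([{"birth_date": "1990-01-01"}, {"birth_date": None}])
```

Here half the records fail the check, so the returned record carries `alert=True`, which a workflow tool could route to the rule owner.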
- The Developed & Approved Processes for DQ Issue Management & Remediation: The Entity must attach a report clarifying the developed & approved processes followed for DQ issue management & remediation / resolution, including the following, as a minimum:

 • The development of a remediation plan, which must include the following, as a minimum:

    • A "Root Cause" analysis to determine the causes of the identified DQ issue.

    • An impact analysis to assess the negative consequences and determine the issue level (whether it is local / limited, or at the Entity level).

    • The definition & identification of the DQ targets set for each of the issues & challenges, related to each DQ Dimension, depending on the context of the issue / challenge within the Entity.

    • The definition & identification of the options for resolving the issue's "Root Cause", including a feasibility analysis.

    • The specifications of the Data Cleansing process to be performed if the solution does not correct the "Root Cause" of the DQ issue.

    • The decision & the logical rationale for selecting the specific option chosen to solve the issue.

    • The implementation status of the issue's most suitable resolution choice (including any change).

    • The review of the implemented resolution and a verification that the issue is resolved.

 • Establishing & developing a roadmap and milestones for the resolution of the noticed & identified DQ issues.

 • Allocating the necessary resources to implement the DQ issue identification plan.

Level 3: Activated

- The Planned & Conducted Initial & Periodic Data Quality Assessment Report: The Entity must attach a report clarifying the initial & periodic DQ assessment that was planned for and has been conducted. The assessment must include the following, as a minimum:

 • Collecting business requirements for the quality of the Data in scope.

 • Establishing & defining DQ Rules based specifically on the collected business requirements.

 • Performing a check by Data Profiling based on the pre-defined DQ Rules.

 • Reporting the discovered and identified DQ comments & issues to the concerned / relevant department in the Entity.

 • Developing plans (with key milestones) for resolving the discovered & identified DQ issues.

- The Resolution Status Report of the Identified DQ Issues: The Entity must attach a report presenting the resolution status of the identified DQ issues, including:

 • The implementation status of the issue's most suitable resolution choice.

 • The review of the implemented resolution and a verification that the issue is resolved.


- Evidence of DQ Tools Used for Automating DQ Issue Management Workflows: The Entity must attach a report showing the activation of the DQ tools used to implement DQ issue management and presenting the following tool capabilities, as a minimum:

 1. Data Profiling: statistical Data analysis at the following levels: Data attributes, tables, cross-Domain, and different systems.

 2. DQ Rules Management: DQ rules establishment, development and execution.

 3. DQ Issues Management: automation of workflows for reporting and resolving DQ issues.

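Attribute-level profiling of the kind listed under capability 1 can be illustrated with a small stdlib-only sketch (real DQ tools compute far richer statistics; the field names and sample below are hypothetical):

```python
def profile_attribute(records, field):
    """Basic profiling statistics for one data attribute: row count,
    null count, distinct count, and min/max of the populated values."""
    values = [r.get(field) for r in records]
    populated = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(populated),
        "distinct": len(set(populated)),
        "min": min(populated, default=None),
        "max": max(populated, default=None),
    }

# Hypothetical table sample with one null and one repeated value.
rows = [{"age": 34}, {"age": 41}, {"age": None}, {"age": 34}]
stats = profile_attribute(rows, "age")
```

Statistics like these feed the DQ Rules (capability 2): for example, a completeness rule can be evaluated directly from the `nulls`/`rows` ratio.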
- The Defined & Implemented DQ Service Level Agreements (SLAs): The Entity must attach the defined DQ SLAs, including the following, as a minimum:

 • A timeline and a deadline for the development of a remediation plan for the identified DQ issue.

 • A timeline and a deadline for the implementation and the review of the DQ changes.

 • The escalation actions to be taken when the SLA is not met.

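An SLA of this shape reduces to deadline checks against the issue's discovery date. The sketch below is illustrative only (the 5-day and 30-day deadlines and all field names are hypothetical, not NDI-mandated values):

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical SLA deadlines, in days from issue discovery.
SLA = {"remediation_plan_days": 5, "resolution_days": 30}

def sla_status(discovered: date, plan_done: Optional[date],
               resolved: Optional[date], today: date) -> list:
    """Return the escalation actions currently due for one DQ issue."""
    actions = []
    plan_due = discovered + timedelta(days=SLA["remediation_plan_days"])
    fix_due = discovered + timedelta(days=SLA["resolution_days"])
    if plan_done is None and today > plan_due:
        actions.append("escalate: remediation plan overdue")
    if resolved is None and today > fix_due:
        actions.append("escalate: resolution overdue")
    return actions

# An issue discovered 40 days ago: plan delivered on time, fix still open.
today = date(2023, 10, 15)
due = sla_status(today - timedelta(days=40),
                 plan_done=today - timedelta(days=36),
                 resolved=None, today=today)
```

The returned action list is what a workflow tool would route to the escalation contacts named in the SLA.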
- The List of DQ System Enforcements Adopted by the Entity: The Entity must attach a report clarifying the list of DQ system enforcements adopted by the Entity (e.g., the standard implementation mechanism & the standard implementation result).

Level 4: Managed

- The Monitoring Report of the DQ Management Practices with Pre-defined KPIs: The Entity must attach an updated & approved report on monitoring the DQ practices based on pre-defined KPIs (Indicator Cards).

- Each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

- The Monitoring Report of the DQ Threshold Values: The Entity must attach an approved report on monitoring and updating the threshold values for each DQ Rule.

- The Monitoring Report of the DQ Issue Resolution Process: The Entity must attach an updated & approved monitoring report on the DQ issue resolution process, including the following, as a minimum:

 • The number of resolved DQ issues vs. the number of reported DQ issues.

 • The number of DQ issues resolved after the specified deadlines.

 • The total time of remediation plan development for a discovered DQ issue.

 • The total time of resolving a DQ issue (implementation of a "Root Cause" resolution).

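The resolution-monitoring figures above can all be derived from a DQ issue log. As an illustrative sketch (field names, dates and the 30-day deadline are hypothetical examples):

```python
from datetime import date

def resolution_metrics(issues, deadline_days=30):
    """Compute the monitoring figures listed in the checklist from a DQ
    issue log: each issue is a dict with a 'discovered' date and an
    optional 'resolved' date (hypothetical field names)."""
    reported = len(issues)
    resolved = [i for i in issues if i.get("resolved")]
    late = [i for i in resolved
            if (i["resolved"] - i["discovered"]).days > deadline_days]
    return {
        "reported": reported,
        "resolved": len(resolved),
        "resolved_after_deadline": len(late),
    }

# Hypothetical log: one issue closed on time, one late, one still open.
log = [
    {"discovered": date(2023, 1, 1), "resolved": date(2023, 1, 20)},
    {"discovered": date(2023, 1, 1), "resolved": date(2023, 3, 1)},   # late
    {"discovered": date(2023, 2, 1), "resolved": None},               # open
]
metrics = resolution_metrics(log)
```

Total remediation-plan and resolution durations would be computed the same way, from plan-delivery and resolution timestamps in the same log.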
Level 5: Pioneer

- The Continuous Improvement Report on the Tools Used for DQ: The Entity must attach an updated & approved report showing that the Entity is periodically monitoring & regularly optimizing the tools used for DQ.

- The Continuous Improvement Report on the DQ Management Practices: The Entity must attach an updated & approved report showing that the Entity has identified, implemented and is regularly monitoring the continuous improvement mechanisms of the DQ Domain practices.

- The Implemented / Adopted DQ Industry Standards: The Entity must provide evidence of the adopted DQ Management standards (e.g., ISO 8000).
Checklist – Data Quality Domain

DQ.MQ.3 Has the Entity established and implemented practices to monitor and report the Entity's Data Quality (DQ) status?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing DQ Monitoring Practices: The Entity must attach an updated & approved report clarifying the Entity’s current DQ monitoring practices.

- Evidence of the Entity's Current DQ Status: The Entity must attach an updated & approved report showing the current DQ situation.


Level 2: Defined

- The Defined and Formalized DQ Monitoring Plan: The Entity must attach the defined DQ monitoring plan showing the activities required to monitor and document the Data Quality status on a regular basis, with the resources assigned for implementation.

- The Defined DQ Checkpoints Report: The Entity must attach a report clarifying the specific checkpoints defined for DQ monitoring.

Level 3: Activated

- DQ Scorecards or Dashboards: The Entity must attach a report illustrating DQ scorecards or DQ dashboards. This should include the following, as a minimum:

 • Execution of the defined DQ rules according to defined triggering conditions (time schedule, event).

 • Reporting of the noticed and identified DQ issues to Data Stewards & Owners (as a minimum: a Business Data Steward & a Business Data Executive).

- A Report on the Data Quality Metadata Logged on the Data Catalog Tool: The Entity must attach a report presenting the DQ Metadata registered in the Data Catalog tool as per the process identified in the MCM Domain (Data Catalog & Metadata Management / Management of the Catalog & Metadata). The DQ Metadata must include the following, as a minimum:

 • The existing DQ Rules.

 • The DQ monitoring process results.

- Evidence of a Data Quality Support Process Implemented as a Workflow: The Entity must attach a report illustrating DQ support implemented as a workflow process to solve the issues discovered during the DQ reviews. The report must include the following, as a minimum:

 • A clear process that enables data users to report DQ issues to Business Data Stewards.

 • A diagram of the workflow in the automated Data Catalog tool.

- The Results of the DQ Checkpoint Reviews: The Entity must attach a report clarifying the review / audit results of the DQ checkpoints, including the following, as a minimum:

 • A log containing the detected DQ issues.

 • A remediation plan for the detected DQ issues.

Level 4: Managed

- Trends from the DQ Monitoring Activities with Pre-defined KPIs: The Entity must attach an updated & approved report about the trends of the DQ monitoring activities, based on the data of the pre-defined KPIs (Indicator Cards), including the following, as a minimum:

 • The number of DQ issues reported based on the established & implemented DQ Rules.

 • The number of DQ issues reported by the Data Catalog users.

 • The number of DQ Rules deployed.

- Each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- A Continuous Improvement Report on the DQ Monitoring and Reporting Practices: The Entity must attach an updated & approved report clarifying the monitoring and reporting practices that have been reviewed and optimized to raise DQ, including the following, as a minimum:

 • The documents of the periodic reviews & documented results of the DQ monitoring & reporting practices.

 • The continuous improvement mechanisms of the DQ monitoring & reporting practices.
Checklist – Data Quality Domain

DQ.MQ.4: Has the Entity developed Data Quality (DQ) standards, provided definitions for its Datasets, and published / uploaded the definitions on the National Data Catalog (NDC)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing List of Data Standards and Data Definitions: The Entity must attach an updated & approved report listing the current DQ standards and explaining the Data definitions.

Level 2: Defined

- The Developed Data Standards for Data Elements: The Entity must attach an updated & approved report clarifying the standards developed for the Data elements.

- The Identified Metadata with their Definitions: The Entity must attach an updated & approved report clarifying the identified Metadata with their definitions.

- The Identified Datasets to be Published on the National Data Catalog (NDC): The Entity must attach an updated & approved report showing the identified Datasets to be published on the National Data Catalog (NDC).

Level 3: Activated

- A Report about the Data Standards & the Data Definitions which the Entity Uploaded on the NDC: The Entity must attach an updated & approved report proving that the Entity’s Data Standards & Data Definitions have been published on the National Data Catalog (NDC).

- The Entity-Specific List of Applied Data Definitions and Applied Data Standards: The Entity must attach an updated & approved report listing the approved Entity-specific definitions & Data standards.

Level 4: Managed

- A Monitoring Report for the Data Definitions and Data Standardization with Pre-defined KPIs: The Entity must attach an updated & approved report on monitoring the Data definitions and Data standardization based on pre-defined KPIs (Indicator Cards), e.g.:

 • The percentage of defined Data elements.

 • The percentage of business Metadata attributes which are defined and published on the NDC.

 • The percentage of technical Metadata attributes which are defined and published on the NDC.

- Each indicator’s data or card must include the following, as a minimum:

 • Indicator’s Name / Code.

 • Indicator’s Owner.

 • Indicator’s Coordinator.

 • Indicator’s Description.

 • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 • Indicator’s Equation.

 • Measurement Unit (Percentage, Number / Quantity, etc.).

 • Baseline (Measurement value in the first measurement year).

 • Target value.

 • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 • Data sources used to calculate the indicator.

 • Data collection mechanism.

 • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- A Continuous Improvement Report for Optimizing the Data Standards and Definitions within the Entity and on the NDC: The Entity must attach an updated & approved report presenting regular reviews and continuous improvements of the Data Standards & Data Definitions, both within the Entity and on the National Data Catalog (NDC), including the following, as a minimum:

 • The documents of the periodic reviews & documented results of the Data Standards & Data Definitions.

 • The continuous improvement mechanisms of the Data Standards & Data Definitions.

8.2.4. Data Operations Domain

Checklist – Data Operations Domain

DO.MQ.1: Has the Entity developed and implemented a plan to manage and satisfy the needs of Data Operations, Data storage and Data retention?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Approved Initial Data Operations and Storage Plan: The Entity must attach the initial Data Operations and Storage Plan, including the following, as a minimum:

 • A roadmap that includes the activities and milestones.

 • The assignment of the required resources & budget allocation.

Level 2: Defined

- The Developed and Approved Data Operations and Storage Plan: The Entity must attach the developed and approved Data Operations and Storage Plan, including the following, as a minimum:

 • A roadmap that includes the activities and milestones.

 • The assignment of the required resources & budget allocation.

 • A prioritized list of the information systems (based on their criticality to the business progress).

 • The approved Policies for Data Operations, storage and retention, as well as business continuity, wherein each Policy document includes the following:

    • Policy title / name.

    • Policy owner.

    • Release date.

    • Version number.

    • Version history.

    • Objective.

    • Identified stakeholders (e.g., target audience).

    • Communications process.

    • Monitoring procedure.

    • Document control (preparation, review, approval).

    • Policy statement.

    • Roles & responsibilities.

    • Terminology.

    • Scope of work.

    • Activation mechanisms.

    • Approval.

    • References.

- The Information Systems Priority List: The Entity must attach a list of prioritized information systems to be followed in establishing the system recovery order in the Disaster Recovery (DR) plan.

- The Developed and Approved Policies for Data Operations, Storage, Retention and Business Continuity.
- The Entity must attach a report on the Policies which are approved for Data Operations, storage, retention, and business continuity, including the following, as a minimum:

 Storage conditions which ensure Data protection in disaster events.

 Data retention periods based on Data type, Data Classification (DC) level, Data value for/in the business, and legal requirements.

 The disposal and destruction rules based on the Data type and classification level.

 The required actions in the event of an accidental permanent loss of Data.
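The retention and disposal rules above amount to a lookup keyed on Data type and classification level, floored by any legal requirement. A minimal sketch in Python; the data types, classification levels, and periods are illustrative assumptions, not values mandated by the NDI:

```python
# Illustrative retention-rule lookup; categories and periods are assumed
# examples, not NDI-prescribed values.
RETENTION_YEARS = {
    # (data type, classification level) -> retention period in years
    ("financial", "confidential"): 10,
    ("financial", "public"): 7,
    ("operational", "internal"): 5,
}

def retention_period(data_type: str, classification: str,
                     legal_minimum_years: int = 0) -> int:
    """Return the retention period, never shorter than any legal minimum."""
    policy_years = RETENTION_YEARS.get((data_type, classification), 1)
    return max(policy_years, legal_minimum_years)

def disposal_rule(classification: str) -> str:
    """Pick a destruction method by classification level (assumed mapping)."""
    return "secure_wipe" if classification != "public" else "standard_delete"
```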

- The Periodic Forecasting Plan for Storage Capacity.
- The Entity must attach a periodic forecast report on proactive planning to satisfy the expected storage capacity requirements, including the following, as a minimum:

 Historical Data storage capacities used.

 Performance Evaluation of the Entity’s applications and needs.


 A list of upcoming applications to be developed and built for the Entity.

- The Database Technology Evaluation Process and Selection Criteria.
- The Entity must attach the Database Technology Evaluation Process and Selection Criteria including the following, as a minimum:

 The Total Cost of Ownership (TCO) including, at least: licensing, support, training, and hardware.

 The availability of resources skilled in these technological solutions, both internally and in the job market.

 The presence of software tools related to the Database technologies being evaluated in the Entity.

 Volume and velocity limits of the utilized technologies.

 Reliability provided by the utilized technologies.

 Scalability of each technology.

 Security controls provided by the utilized technologies.

Level 3: Activated

- The Data Operations and Storage Plan with Implementation Status Report.
- The Entity must attach the Data Operations and Storage Plans with a report clarifying the implementation status, including the following, as a minimum:

 The achievement percentages of the actions / tasks included in the Data Operations and Storage Plans.

- The Storage Trend Forecast Document.
- The Entity must attach a report clarifying the expected storage capacity, including the following, as a minimum:

 Forecasting the Entity’s future needs of Data storage capacity.

 Estimating budgets for future purchases of Data storage space.
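The storage trend forecast described above can be approximated with a least-squares line fitted to historical monthly usage. A minimal sketch, assuming roughly linear growth; the monthly figures are illustrative, not taken from the NDI:

```python
# Illustrative storage-capacity forecast from historical usage.
def linear_forecast(usage_gb: list[float], periods_ahead: int) -> float:
    """Fit y = a + b*t by least squares over the history and extrapolate."""
    n = len(usage_gb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(usage_gb) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, usage_gb)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a + b * (n - 1 + periods_ahead)

history = [100.0, 110.0, 120.0, 130.0]   # GB used per month (illustrative)
expected_gb = linear_forecast(history, 3)  # capacity needed three months out
```

Multiplying the extrapolated capacity by a unit storage cost gives the budget estimate the checklist asks for.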

- The Database Technology Evaluation Report.
- The Entity must attach a Database technology assessment report including the following, as a minimum:

 The performance of Database technologies.


 The availability of technical support for the current version of each Database technology.

 Technical support availability for software tools and operating systems of the Databases.

- The Document on the Entity's Application Performance Assessment.
- The Entity must attach an evaluation report on the performance of the Entity's applications, including the following, as a minimum:

 The number of transactions by each application used in the Entity.

 The percentage % of application utilization to perform the Entity’s work in an automated manner.

 Availability – Users’ accessibility to the Entity’s applications.

- The Entity's Applications Development Roadmap with the Status Report on the Implementations.
- The Entity must attach a roadmap document for the upcoming applications to be developed and a report on their development statuses, including the following, as a minimum:

 The activities and milestones.

 Application development prioritization.

 Each application’s development status.

- The Budget Estimations of the Procurement Transactions of the Future Storage Needs.
- The Entity must attach a report clarifying the estimated budgets for future procurements of Data storage processes.

- The Data Operations Orchestration Document.
- The Entity must attach a document on Data Operations Orchestration, including assigning teams to operate and maintain systems and Data, and to process and analyze Data.

Level 4: Managed

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Implementation Progress and Effectiveness of the Data Operations Plan.
- The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.


 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).
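The Indicator Card fields listed above map naturally onto a small data structure, with the polarity deciding how a measurement compares to the target. A sketch in Python; the field subset, names, and the sample card are assumptions for illustration, not NDI-prescribed content:

```python
# Illustrative Indicator Card capturing a subset of the fields enumerated
# above; the sample values are assumptions.
from dataclasses import dataclass

@dataclass
class IndicatorCard:
    name_code: str
    owner: str
    description: str
    unit: str          # e.g. "Percentage" or "Number / Quantity"
    baseline: float    # measurement value in the first measurement year
    target: float
    periodicity: str   # "Monthly" / "Quarterly" / "Biannually" / "Annually"
    polarity: int      # +1: higher value is the target, -1: lower value is the target

    def on_track(self, measured: float) -> bool:
        """True when a measurement meets the target under the card's polarity."""
        return measured >= self.target if self.polarity > 0 else measured <= self.target

kpi = IndicatorCard("DO-01", "Data Office", "Plan tasks completed",
                    "Percentage", 40.0, 90.0, "Quarterly", +1)
```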

Level 5: Pioneer

- The Continuous Improvement Report on the Practices of Data Operations, Storage, and Retention.
- The Entity must attach a report presenting that the Entity specified, implemented, monitored, and reviewed mechanisms for continuous improvement of the Data Operations and storage plans, and satisfying Data retention needs. The report must include the following, as a minimum:

 The documents of the periodic reviews & documented results of the Data Operations Plans and Data Retention needs.

Checklist – Data Operations Domain

DO.MQ.2: Does the Entity have in place a defined methodology, processes and Standard Operating Procedures (SOPs) for Database operations?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of Data Operations Done within the Entity.
- The Entity must attach evidence of the Database operations performed by the Entity (e-mails, screenshots).

Level 2: Defined

- The Process Documentation of the Detailed Practices of Data Operations, Including:

A. Database Monitoring.
- The Entity must attach the overall Process of the Detailed Practices of Database Monitoring including:

 The process for monitoring and reporting database performance on a regular basis.

B. Database Access Control.
- The Entity must attach the detailed process for providing the Entity's employees access to the databases.

C. Storage Configuration Management.
- The Entity must attach the detailed Practices of Storage Configuration Management including:

 Configuration identification.

 Configuration change control.

 Configuration status accounting.

 Configuration audits.

D. DBMS Versioning Mechanism.
- The Entity must attach the overall Process of the Detailed Practices of the DBMS Versioning / Updating Mechanism, covering:

 The Management Plan of Updated Releases / Versions.

 The Strategy Including Analysis and Rationale.


E. The Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).
- The Entity must attach the process for defining Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).

Level 3: Activated

- The Database Monitoring Status Report.
- The Entity must attach a monitoring status report on Database performance, including the following, as a minimum:

 Capacity – The size of the unused storage.

 Availability – Users’ accessibility to the Entity’s Databases, as & when needed.

 Queries Execution Performance – Query execution times, durations and errors.

 Tracking Changes – Tracking Database changes for root cause analysis, as & when needed.

- The Data Operations Operating Model.
- The Entity must attach the approved Operational Model of Database Operations, including the following, as a minimum:

 The teams’ roles and responsibilities.

 The procedures and practices of Database Operations management.

 The technologies used to support Database Operations.

- Evidence of Agreements (SLAs and OLAs).
- The Entity must attach a report on Database performance agreements, e.g.: Service Level Agreements (SLAs) and Operational Level Agreements (OLAs), including the following, as a minimum:

 The timeframe of making the Database available for users.

 The maximum time allowed to complete electronic transactions on a specific application.

 Each agreement must clarify the escalation procedures to be followed when the agreement is violated.
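The agreement elements above reduce to two measurable checks: availability against the agreed timeframe and transaction time against the agreed maximum, with any violation triggering the escalation procedure. A minimal sketch; the default thresholds are assumed values, not figures specified by the NDI:

```python
# Illustrative SLA / OLA compliance check; agreed thresholds are assumptions.
def sla_violations(measured_uptime_pct: float, slowest_txn_seconds: float,
                   agreed_uptime_pct: float = 99.5,
                   agreed_txn_seconds: float = 2.0) -> list[str]:
    """Return the violations that should trigger the agreement's escalation procedure."""
    violations = []
    if measured_uptime_pct < agreed_uptime_pct:
        violations.append("availability below agreed timeframe")
    if slowest_txn_seconds > agreed_txn_seconds:
        violations.append("transaction time above agreed maximum")
    return violations
```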


Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for Database Operations Management.
- The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the Implemented Practices and Processes of Data Operations and Storage.
- The Entity must attach a periodic evaluation report on continuous improvement of the following, as a minimum:

 The processes, procedures and practices followed during Database Operations.

 The operational metrics (Service Level Agreements (SLAs), Operational Level Agreements (OLAs), Key Performance Indicators (KPIs), Key Quality Indicators (KQIs)).


- The Knowledgebase Document.
- The Entity must attach a Knowledgebase document including the following, as a minimum:

 All lessons learnt, test / trial cases and user stories.

 The log of errors and issues.

 The solutions of the recorded errors and issues.

Checklist – Data Operations Domain

DO.MQ.3: Does the Entity have in place practices and processes for Business Continuity, such as backup and disaster recovery (DR), and a defined Business Continuity Plan (BCP) for the Data?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of Data Storage Backup Instruction Documents.
- The Entity must attach a sample document of the Data storage backup instructions (e-mail, guide / manual).

Level 2: Defined

- The Developed Processes and Practices for Data Storage Backups and Recovery.
- The Entity must attach the process and practices developed for the "Data Storage Backup" and "Data Backup Recovery", including the following, as a minimum:

 Defining and determining the backup frequency of each information system.

 Scope of backup for each information system, including the scope of Data and the scope of Database transaction logs.

 Location of backup files, including a storage medium and a physical location.

 Periodic validations of backup completions using system copies in non-production environments.

- The Business Continuity Plan (BCP), the Disaster Recovery (DR) Plan, and the Processes.
- The Entity must attach processes and plans that ensure work progress without disruption, including the following, as a minimum:

 The Business Continuity Processes and Plan (BCP) including, as a minimum:

 Risk assessment and business impact analysis.

 Roles and responsibilities.

 Alternative work locations.

 The infrastructure.

 Incident management.

 The communication mechanisms of the continuity procedures to notify the relevant stakeholders, e.g.: Employees and customers.

 The Disaster Recovery (DR) Processes and Plan including, as a minimum:

 A prioritized list of information systems defining their recovery order.

 Assigning the roles responsible for addressing / handling incident cases and responding.

 Defining and specifying the procedural actions to be taken to activate a response to each incident through the system.

 Defining and specifying the procedural actions to be taken to reduce the damage and mitigate the incident consequences on the Entity's critical operations.

 Definition & identification of Recovery Point Objectives (RPO) (the maximum targeted period within which Data might be lost without causing damage to the business) for each information system covered in the plan.

 Definition of Recovery Time Objectives (RTO) (the maximum targeted duration of time within which the Database can be down without causing damage to the business) for each information system covered in the plan.

 Definition of Recovery activities.

 Periodic training courses with scenario-based trial simulations to evaluate the response and identify areas for improvement.
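The RPO and RTO definitions above reduce to two comparisons: worst-case Data loss is bounded by the backup interval, so the interval must not exceed the RPO; and the recovery duration observed in a DR drill must not exceed the RTO. A minimal sketch with illustrative per-system values (the system names and hours are assumptions, not from the NDI):

```python
# Illustrative RPO / RTO verification per the definitions above.
def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """Worst-case Data loss equals the backup interval, so it must not exceed the RPO."""
    return backup_interval_hours <= rpo_hours

def meets_rto(measured_recovery_hours: float, rto_hours: float) -> bool:
    """The recovery duration observed in a DR drill must not exceed the RTO."""
    return measured_recovery_hours <= rto_hours

# name: (backup interval h, RPO h, last drill recovery h, RTO h) -- assumed values
systems = {
    "payments": (1.0, 4.0, 2.0, 4.0),
    "archive": (24.0, 12.0, 6.0, 24.0),
}
gaps = [name for name, (bi, rpo, rec, rto) in systems.items()
        if not (meets_rpo(bi, rpo) and meets_rto(rec, rto))]
```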

- A Document of Actions Required to Implement Database Changes and Rollbacks.
- The Entity must attach a document explaining the definitions of the measures and the procedural action steps to be taken to implement the changes on each Database or to rollback / undo the changes as & when needed.

Level 3: Activated

- The Technical Design Document.
- The Entity must attach the Technical Design document including the following, as a minimum:

 Accurate information about the technical infrastructure of the main location and the Disaster Recovery (DR) location.

 The processes and procedures which are required for Data recovery.

 System architecture, including hardware, software, and network structure.

 Tasks and responsibilities of the technical team responsible for DR.

 Testing and maintenance procedures of the DR processes.

- The BCP Run Report.
- The Entity must attach the BCP Run Report, i.e., the output document after BCP execution, explaining the results, recommendations, etc. which ensure the continuous availability of critical business functions during adverse / damaging incidents. The following must be included, as a minimum:

 The implementation status of the BCP, practices and processes.

 The implementation status of the DR Plan, practices and processes.


- The Change Request for Production Data.
- The Entity must attach the approved Change Request form / template used for requesting a Data change in the production environment.

- The Production Data Access Control Document.
- The Entity must attach the Access Control document of Data in the Production Environments (based on the authorizations matrix and the roles), including the following, as a minimum:

 The change requests initiating the process.

 Definition and identification of procedural actions to be taken for a controlled implementation of changes in the Databases.

 Definition and identification of procedural actions to be taken for rollback / reversing the changes in cases of identified issues.
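The controls above combine two gates on any production Data change: a role permitted by the authorization matrix, and an approved change request initiating the process. A minimal sketch with hypothetical roles, actions, and matrix entries (none of these are NDI-defined values):

```python
# Illustrative role-based gate for production Data changes; the roles,
# actions and matrix contents are assumptions.
AUTHORIZATION_MATRIX = {
    "dba": {"read", "change", "rollback"},
    "developer": {"read"},
    "auditor": {"read"},
}

def may_apply(role: str, action: str, has_approved_change_request: bool) -> bool:
    """Permit writes only with both a permitted role and an approved change request."""
    if action not in AUTHORIZATION_MATRIX.get(role, set()):
        return False
    return action == "read" or has_approved_change_request
```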

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for Business Continuity (BCP) and Disaster Recovery (DR).
- The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards) for the following:

 Data Storage Capacity Utilization:

 The percentage % of total capacity used.

 The percentages % of capacities used by the type of Database.

 The percentage % of capacity used for backups.

 The number of performed Data transactions.

 The average time of queries execution.

 The BCP implementation monitoring KPIs Report.

- Each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.


 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the Business Continuity Plan (BCP).
- The Entity must attach a detailed report on optimizing the following, as a minimum:

 BCP.

 Processes for production environment authorization access control.

 Processes for Data backup and recovery.

 DR plan.

- The continuous improvement report must include the following, as a minimum:

 The documents of the periodic reviews & documented results.

 The continuous improvement mechanisms.

8.2.5. Document and Content Management Domain

Checklist – Document and Content Management Domain

DCM.MQ.1: Has the Entity developed a Document and Content Management (DCM) plan and a Digitization plan to manage the implementation of paperless management activities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of Existing DCM Activities.
- Evidence must be attached proving the existence of DCM activities, including what supports practicing a DCM activity, such as:

 Copies / screenshots of e-mails.

 Copies / screenshots of documents.

 Screenshots of systems / tools.

Level 2: Defined

- The DCM Plan.
- The DCM plan must include the following, as a minimum:

 A roadmap that includes the activities and milestones for the implementation of Documents and Content Management activities. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 The assignment of the required resources & budget allocation to manage the implementation of Documents and Content Management processes.

- The DCM Digitization Plan.
- A report must be attached containing the DCM Digitization Plan and the implementation status with the necessary details as follows, as a minimum:

 Roadmap with the activities and key milestones for the migration of the Entity's existing paper-based documents to the electronic format.

 Roadmap with the activities and key milestones for the implementation of initiatives focused on eliminating the creation of paper-based documents in the Entity and replacing them with electronic documents.

 Assignment of the required resources and budget to manage the implementation of paperless management initiatives.


- The Developed Prioritization Process for Documents and Content.
- The Prioritization Process of Documents and Content must include the following, as a minimum:

 The definition of “Data Prioritization”.

 The prioritization matrix.

 The identification and definition of DCM procedures.

 The ranked list of prioritized document workflows.

- The Identified DCM Processes / Procedures.
- Based on the approved policies, the Entity must attach a report on the approved procedures followed for Data Operations (DO), storage, retention and business continuity, containing the following:

 Storage conditions ensuring Data protection in disaster events.

 Data Retention periods based on Data type, classification, business value and legal requirements.

 Disposal and destruction rules based on the Data type & Data classification.

 The actions required in the event of an accidental permanent loss of Data.

- The Ranked List of Prioritized Document Workflows to be Implemented.
- The Entity must attach a report clarifying a ranked list of tasks for each prioritized Workflow of Documents, including:

 The definition and rating of the workflows based on the level of importance and impact on business operations.

 The tasks of each prioritized Workflow based on the compliance requirements, risk factors, and operational needs.

 A clear implementation roadmap for each Workflow including the timelines and resource assignments.

- The Developed DCM Training Plan.
- The Entity must attach the developed DCM training plan, including:

 The target audience groups identified in the training plan, such as employees, managers, and IT staff.

 The developed training curricula, including presentations, handouts and online resources, designed / customized specifically to satisfy the identified & defined needs.

 The training delivery staff’s assigned responsibilities, including trainers or facilitators.

 The training schedule.


Level 3: Activated

- The DCM Plan Implementation Status Report.
- A report must be attached clarifying the DCM Plan’s implementation status, including the following, as a minimum:

 The achievement percentages of the actions / works included in the DCM Plan.

- The Digitization Plan Implementation Status Report.
- A report must be attached clarifying the Digitization Plan’s implementation status, including the following, as a minimum:

 The achievement percentages of the actions / works included in the Digitization Plan.

- The List of Prioritized Documents for Digitization.
- A report must be attached clarifying the prioritized list of documents to be digitized based on the Inventory of Data Elements within the Entity.

- The DCM Roles and Responsibilities.
- The Entity must attach the approved DCM Roles & Responsibilities.

- The DCM Training Plan Implementation Status Report.
- The DCM Training Plan implementation status report must be attached, including the following, as a minimum:

 The objectives of the training program with the topics covering:

 Introduction of the DCM Policies.

 Introductory and advanced tutorials about DCM systems used by the Entity and their functionalities.

 The dates and implementation status.

Level 4: Managed

- The Monitoring Reports with Pre-defined KPIs for the Implementation of the DCM Plan & the DCM Digitization Plan.
- The Entity must attach monitoring reports prepared based on pre-defined KPIs (Indicator Cards), including, as a minimum:

 The Monitoring Report with approved & pre-defined KPIs for the Implementation of the DCM Plan, with the Indicator Card(s) attached.

 The Monitoring Report with approved & pre-defined KPIs for the Implementation of the Digitization Plan, with the Indicator Card(s) attached.

- In both reports, each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.


 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the DCM and Digitization Plans.
- A report must be attached showing that the Entity identified, implemented and is regularly monitoring the Continuous Improvement mechanisms for both the DCM Plan & the Digitization Plan, including the following, as a minimum:

 The DCM plan continuous improvement mechanisms.

Checklist – Document and Content Management Domain

DCM.MQ.2: Has the Entity implemented policies and processes for Document and Content Management (DCM) including: Backup & Recovery, Retention & Disposal, Document & Content Access Approval, and Metadata Publishing?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- Evidence of Processes for Retaining & Disposing of Documents.
- Evidence must be attached covering the Processes of Document Retention and Disposal, including what supports practicing each procedure / process, such as:

 Copies / screenshots of e-mails.

 Copies / screenshots of documents.

 Screenshots of systems / tools.

Level 2: Defined

- The Developed Policy Document for the DCM Lifecycle.
- Across the DCM Lifecycle, the Entity must attach the developed Policies for each of the following, as a minimum:

 The Naming Convention Policies.

 Policies for assigning classification levels to documents.

 The Access Approval Policies.

 The Backup & Recovery Policies.

 The Retention & Disposal Policies.

- Every Policy’s document must include the following, as a minimum:

 Policy Name.

 Policy Owner.

 Release date.

 Version number.

 Document control (Preparation, review, approval).

 Version history.

 Terminology.

 Goal.

 Scope of work.

 Principles.

 Policy Statement.

 Job roles & responsibilities.

 Related Policies.

 References.

- The Developed DCM Backup & Recovery Procedures.
- The Entity must attach reports showing evidence that the Entity has included the Document and Content Management Systems within its overall backup and recovery plan for DCM Backup & Recovery.

- The Developed DCM Retention & Disposal Procedures.
- The Entity must attach reports containing the documentation of the developed DCM Retention & Disposal Processes / Procedures.

- The Developed DCM Role-Based Access Approval Procedures.
- The Entity must attach reports containing the documentation of the developed DCM Access Approval & Role-Based Access Authorization Processes.


- The Developed DCM Metadata Publishing Procedures.
- The Entity must attach reports containing the documentation of the developed processes for the following:

 Publishing the Metadata of Documents & Contents.

 Classifying Documents & Contents.

Level 3: Activated

- The Implemented Workflow for the DCM Retention & Disposal Process.
- Evidence must be attached for the workflow performed for the Retention & Disposal process. This shall include:

 Evidence of handover of documents to the Entity's archival facility.

 Evidence of physical destruction of documents, including overwriting and secure deletion.

- The Implemented Workflow for the DCM Access Approval Process.
- Evidence must be attached for the workflow performed for the Access Authorization Approval process based on job roles & job tasks (a copy of the work procedure steps).

- Evidence of Document Transfers to the Archival Facility (Archival Register).
- Evidence must be attached to prove transferring documents to the Entity’s Archival Facility / Archives Unit (registering the archive in a system), e.g.: Copies of Archived Documents.

- The Documents Disposal Register.
- Evidence must be attached to prove the existence of a Register for logging the Disposal of Documents (a copy of the Documents Disposal Register).

- The Access Rights Documentation.
- The Entity must attach the Access Rights Documentation including the following, as a minimum:

 The mechanism of accessing documents and contents, clarifying the dependence on the role and job tasks.

 The identified Access Groups based on Data Classification Domain Standards & Controls.

- The Report on Document & Content Metadata Publishing.
- The Entity must attach a report on Publishing Metadata of Documents & Contents, including the following, as a minimum:

 The Metadata Standards.

 The Forms / Templates.

 The Data Catalog Integration document.


Level 4: Managed

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) of the DCM Processes Aligned with the Policies.
- The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards), including, as a minimum:

 The size / volume of the Entity’s Documents stored in the Entity's Document Management System (DMS).

 The number of system / tool users.

 The percentage % of the migrated records.

 The number of users of the Entity’s Document Management System (DMS).

 The percentage % of the paper-based documents transformed into electronic formats.

- Each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).
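For illustration only, an Indicator Card with the fields above can be captured as a simple data structure; every name and value here is hypothetical, and the NDI does not mandate any particular representation:

```python
# Hypothetical Indicator Card covering the minimum fields; values are examples.
indicator_card = {
    "code": "DCM-KPI-01",
    "name": "Paper records transformed into electronic formats",
    "owner": "DCM Process Owner",
    "coordinator": "Records Officer",
    "description": "Percentage of paper-based documents digitized into the DMS.",
    "objective": "Complete the digital transformation of archival records",
    "equation": "digitized_documents / total_documents * 100",
    "unit": "Percentage",
    "baseline": 20.0,        # measurement value in the first measurement year
    "target": 90.0,
    "periodicity": "Quarterly",
    "data_sources": ["Document Management System (DMS)"],
    "collection_mechanism": "Automated DMS report",
    "polarity": "+",         # "+": higher value is the target; "-": lower is the target
}

def meets_target(card, measured_value):
    """Evaluate a measured value against the card's target, honouring polarity."""
    if card["polarity"] == "+":
        return measured_value >= card["target"]
    return measured_value <= card["target"]
```

Recording polarity explicitly lets a monitoring report evaluate every indicator with the same rule, regardless of whether higher or lower values are desirable.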

Level 5: Pioneer

- The Continuous Improvement Report on the DCM Processes and Practices. - A report must be attached clarifying that the Entity identified, implemented and is regularly monitoring the Continuous Improvement mechanisms for the DCM Processes, including the following, as a minimum:

 The continuous improvement mechanisms of the DCM Processes.

 The review documents and the periodic results of the DCM Processes.

Checklist – Document and Content Management Domain

DCM.MQ.3 Has the Entity implemented a tool to support Document and Content Management (DCM) processes including Digitization Management implementation?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of Document Storage / Retention & Disposal Activities. - The Entity must attach a report on the current activities related to document storage, document retention, and document disposal, including one of the following to support practicing each activity, e.g.:

 Copies / screenshots of e-mails.

 Copies / screenshots of documents.

 Screenshots of systems / tools.

Level 2: Defined

- The Documented DCM Tool Requirements. - The Entity must attach the documented DCM Tool requirements, including the following, as a minimum:

 The functional requirements of the tool.

 The non-functional requirements of the tool.

 The tool’s use case template / form.

Level 3: Activated

- The Implemented Tool for DCM. - The Entity must attach evidence that the implemented DCM tool includes the following, as a minimum:

 Document Management System - an application used to capture, store and manage documents in an electronic format (electronic documents and digital media). The selected DMS tool shall provide, at minimum, the following capabilities:

 Storage of documents.

 OCR (Optical Character Recognition) functionality to analyze imported images.

 Indexing of documents.

 Versioning of documents, including tracking of the history of changes.

 Secured access to documents.

 Global search and discovery on the registered documents.

 Development of document workflows.


 Web Content Management System - an application used to store and manage website content used by the Entity's portals and internet sites.

 Collaboration tools – applications providing users with a platform to collaborate in real time on electronic documents, communicate using chat, and track changes in the documents.

- The Record of Digitized Documents. - The Entity must attach evidence of the number of Digitized Documents (a copy of the Record of Digitized Documents).

- The List of Approved Users. - The Entity must attach a list of the approved users of the DCM system / tool.

Level 4: Managed

- The Monitoring Report on the Implemented Tool with Pre-defined Key Performance Indicators (KPIs). - The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the DCM Solution / Tool Design and DCM Practices. - A report must be attached clarifying that the Entity identified, implemented and is regularly monitoring the Continuous Improvement mechanisms for the implemented tool's performance.

8.2.6. Data Architecture and Modelling Domain

Checklist – Data Architecture and Modelling Domain

DAM.MQ.1 Has the Entity developed and implemented a plan to improve its Data Architecture Capabilities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing DAM Domain-related Practices. - The Entity must attach a report clarifying the current DAM practices.

Level 2: Defined

- The Approved DAM Plan. - The DAM implementation plan must include the following, as a minimum:

 A roadmap with the activities & milestones for the Target State Data Architecture. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 The assignment of the required resources & budget to manage the implementation of the Target State Data Architecture.

- The Approved Current State Data Architecture & the Existing Technical Architecture. - The Entity must attach a report on the Current State Data Architecture and the Technical Architecture to support the development of the Target Architecture. The Current Data Architecture & the existing Technical Architecture shall cover, as a minimum:

 A Data model at the conceptual, logical and physical levels.

 The current processes used in conducting business processes & decision making.

 The Key System Components – the current essential applications, data storages, data processing platforms, and data analytics solutions used in key processes.

 Data flow and Data lineage.

- The Approved Target State Data Architecture. - The Entity must attach the Target Data Architecture including the following, as a minimum:

 A Data model at the conceptual, logical and physical levels.

 The targeted processes used in conducting business processes & decision making.

 The Key System Components – the targeted essential applications, data storages, data processing platforms, and data analytics solutions used in key processes.

 Data flow and Data lineage.
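To make the three modelling levels concrete, the sketch below walks one hypothetical entity from a conceptual description through a logical model to a derived physical definition; the entity, attributes and type mappings are all assumptions for illustration, not part of the NDI:

```python
# Conceptual level: the business concept and its relationships (hypothetical).
conceptual = {"entity": "Citizen", "relationships": ["owns Permit"]}

# Logical level: named attributes with technology-neutral types (hypothetical).
logical = {
    "entity": "Citizen",
    "attributes": {"citizen_id": "identifier", "full_name": "text", "birth_date": "date"},
    "primary_key": "citizen_id",
}

def to_physical_ddl(model,
                    type_map={"identifier": "BIGINT", "text": "VARCHAR(255)", "date": "DATE"}):
    """Derive a physical-level DDL statement from a logical model (sketch only)."""
    cols = ",\n  ".join(f"{name} {type_map[t]}" for name, t in model["attributes"].items())
    return (f"CREATE TABLE {model['entity'].lower()} (\n  {cols},\n"
            f"  PRIMARY KEY ({model['primary_key']})\n);")
```

The point of the layering is that the conceptual and logical models stay stable while the physical mapping (types, naming, deployment platform) can change.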


- The Future State Gap Assessment. - The Entity must attach a report on the Future-State Gap Assessment including the following, as a minimum:

 An analysis of the gaps between the Current Data Architecture and the Target Data Architecture.

- The Approved Enterprise Architecture Framework. - The Entity must attach the Enterprise Architecture (EA) Framework and the general Model of the EA components so that the Government Entity would have a comprehensive business & IT blueprint that is linked & aligned with the Entity’s strategic objectives.

- The DAM Policy. - The Entity must attach the DAM Policies including the following for each policy, as a minimum:

 Policy Name.

 Release Date.

 Release Number.

 Document Control (Preparation, Review, Approval).

 Version History.

 Terminologies.

 Goal.

 Scope of Work.

 Roles & Responsibilities.

 References.

 Policy Owner.

 Policy Statement including the following:

 The Target Data Architecture shall address the strategic requirements defined within the Entity's Data Management & Personal Data Protection (DM & PDP) Strategy.

 The Target Data Architecture shall be mapped to the overall Enterprise Architecture (EA).

 The Target Data Architecture shall adopt a widely used Enterprise Architecture Framework, e.g., TOGAF, Zachman.

 The regular mechanisms of monitoring & updating Data Models.


Level 3: Activated

- The DAM Plan Implementation Progress Report. - The Entity must attach a report clarifying the implementation status including the following, as a minimum:

 The achievement percentages of the initiatives & projects included in the DAM Implementation Plan.

- The EA Framework Implementation Report. - The Entity must attach proof of implementing the approved EA Framework for the Data Architecture.

Level 4: Managed

- The Monitoring Report of the DAM Plan & Activities Implementation with Pre-defined KPIs. - The report must be prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Data Architecture Plan's Continuous Improvement Report. - The Entity must attach a report showing that the Entity identified, implemented and is monitoring the Data Architecture plan’s continuous improvement mechanisms including the following, as a minimum:

 The documentation of the periodic Data Architecture reviews and the documented results.

 The continuous improvement mechanisms.


- The Architecture Change Management Process. - The Entity must attach a document clarifying a specific Architecture Change Management Process for reviewing, approving and implementing changes in the Current & Target Data Architectures. The Architecture Change Scope shall include the following, as a minimum:

 The requests for new DAM initiatives.

 The modifications to the Current State Architecture documents of the existing initiatives.

- The Change Control Document. - The Entity must attach a document clarifying the Change Control Process to be used in changing the Current & Target Data Architectures. The document shall include the following, as a minimum:

 The scope of change.

 The impact assessment of the change.

 The procedures of the change.

 The roles & responsibilities.

 The details of the change approvals.

 The details of the change requester.

- The Data Architecture Checkpoints Report. - The Entity must attach a report showing that the Entity incorporated Data Architecture Checkpoints in the Software Development Lifecycle (SDLC) processes. The Checkpoints shall include the following, as a minimum:

 Investigating the possibilities of reusing the existing Data Architecture components to address the business requirements.

 Validating the conformance of the created Data Models with the Entity's Enterprise Data Model.

 Verifying whether the project implies any change required to the Entity's overall Enterprise Data Model.

Checklist – Data Architecture and Modelling Domain

DAM.MQ.2 Has the Entity developed and implemented practices for Data Architecture & Modelling (DAM) activities (including Data Flows, Data Models and Governance considerations)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Current DAM Domain Activities. - The Entity must attach the following, as a minimum:

 Proof of the current processes followed to manage the Data Flow & DAM Practices.

 Documents which clarify the current (Conceptual & Logical) Data Models.

 Documents which clarify that the Data Lineages (of Data Flows) are aligned with business requirements.

Level 2: Defined

- A document containing the Approved Business Processes on the Data Architecture and any Related Data Flow. - The Entity must attach a document clarifying the Business Processes on the Data Architecture, identifying the roles & responsibilities to ensure compliance with DAM Policies and effective & systematic achievement of Data Architecture activities.

- A document containing the Big Data Considerations, including the Data Lake Requirements. - The Entity must attach a document specifying the requirements for developing a Data Storage / Lake Environment using a vendor-neutral Big Data Reference Architecture Framework (e.g., NIST) and incorporating Big Data architecture components into its overall target Data Architecture design. The Data Storage / Lake requirements shall address the following, as a minimum:

 Ingest – Ingesting and converting semi-structured & unstructured datasets into a structured form.

 Infrastructure – The networking, computing and storage requirements needed to handle large, diverse formats of data.

 Platform – A distributed storage solution providing distributed computing / processing capabilities.


- A document containing the Data Processing Considerations, including the Partitioning Strategy. - The Entity must attach a document specifying data processing considerations including the following, as a minimum:

 A partitioning strategy for its Target State Data Architecture for efficient processing of various data volumes, data variety & data velocity.

 The partitioning strategy's coverage of both real-time and batch processing operations.
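As a minimal sketch of what such a partitioning strategy might specify, the function below derives a date-based partition path that distinguishes batch from real-time ingestion; the path layout, dataset name and mode labels are assumptions, not an NDI requirement:

```python
from datetime import date

def partition_path(dataset: str, event_date: date, mode: str) -> str:
    """Build a storage partition path; `mode` separates real-time from batch loads."""
    if mode not in ("batch", "realtime"):
        raise ValueError("mode must be 'batch' or 'realtime'")
    return (f"/data/{mode}/{dataset}/year={event_date.year}"
            f"/month={event_date.month:02d}/day={event_date.day:02d}")
```

Partitioning by ingestion mode and date keeps high-velocity real-time writes separate from large batch loads, while letting queries prune partitions by day.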

- Model Representation. - The Entity must attach a document demonstrating the selection and documentation of a planning method to document business structure, relationships, and code at the conceptual, logical, and physical levels, and its use throughout the software application development life cycle.

- The DAM Register. - The Entity must attach a Register clarifying the storage mechanism of the following:

 The Data Architecture & Technical Architecture project.

 The reference documentation / materials.

 Data Model designs.

- A document containing the Data Model Representation's Technical Standards & Best Practices. - The Entity must attach a document clarifying that the Entity adhered to a standardized method in building Data Models according to best practices (such as naming conventions, data types, basic attributes, physical model deployment considerations, improvements, etc.).

Level 3: Activated

- The Data Integration Pattern Implementation Document. - The Entity must attach the documents of the Data Integration Pattern diagramming (implemented within the Data Architecture).

- Evidence of DAM Tools & Technologies in Use. - The Entity must choose & implement a set of technologies to design, develop and execute the Entity’s DAM initiatives. The set of technological tools shall include, as a minimum:

 Data Architecture Design - Visually representing data and system components along with data flow diagramming.

 Data Modeling - Drawing functionality to create & modify data and system objects, attributes and relationships, and to reverse-engineer the existing data models.

 Data Lineage - Capturing and maintaining data flows between systems to enable an impact analysis.
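A minimal sketch of how captured lineage can drive an impact analysis, assuming a simple edge-list representation (the system names and record structure are hypothetical, not a mandated format):

```python
# Hypothetical lineage edges: each record says data flows from source to target.
lineage_edges = [
    {"source": "crm.customers", "target": "dwh.dim_customer", "transform": "deduplicate"},
    {"source": "dwh.dim_customer", "target": "reports.customer_kpis", "transform": "aggregate"},
]

def downstream_of(system, edges):
    """Return every system reachable from `system`, i.e. impacted by a change to it."""
    impacted, frontier = set(), {system}
    while frontier:
        frontier = {e["target"] for e in edges if e["source"] in frontier} - impacted
        impacted |= frontier
    return impacted
```

With lineage stored this way, a proposed change to one source system can be traced to every downstream report before the change is approved.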


- Evidence of the Enterprise Data Model Uploaded to the Data Catalog. - The Entity must attach proof that the Data Catalog has been updated based on new or updated Data Models.

- Evidence of Applying the Technical Data Standards. - The Entity must attach an image proving the application of the Technical Standards to the Data Representation Models which were pre-defined in the Data Model Representation’s Technical Standards Document.

Level 4: Managed

- The Monitoring Report of the Implementation of the DAM Practices with Pre-defined KPIs. - The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) which were pre-defined to measure the Entity’s performance in implementing DAM practices, and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report Including the DAM Business Process Documentation & Other Related Data Flow Documentation. - The Entity must attach a report containing the following, as a minimum:

 The documentation of identifying, implementing and monitoring the Continuous Improvement Mechanisms of the DAM practices with the relevant Business Processes.

 The other Data Flow documentation.

8.2.7. Data Sharing & Interoperability Domain

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.1 Has the Entity developed and implemented a Data Sharing and Integration (DSI) Plan in line with the Data Sharing Policies?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Names of the Current Practices of Data Sharing and Integration (DSI). - The Entity must attach a list of the names of the current DSI Practices and Activities, including evidence to support the practice of each Activity, being one of the following, as a minimum:

 Email copies / screenshots.

 Document Copies.

 System screenshots.

- The Updated and Approved Results of the Initial Data Integration Assessment. - The Entity must attach the report of the Data Integration assessment results, including the following, as a minimum:

 The current IT Structure: an inventory of all existing IT components (data sources, systems, applications and data stores).

 High-Level Data Lineage, including the rules according to which data is changed, and the frequency of changes.

 Data Models used by the Entity's IT components.

Level 2: Defined

- The Developed Data Integration Strategy Document. - The Entity must attach the developed Data Integration Strategy containing the following, as a minimum:

 The Strategy’s approval and effective date.

 The Strategy must be recent (approved within the last three years).

 The objectives, initiatives, projects and the metric indicators of the implementation of the Data Integration activities and milestones. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 The implementation roadmap of the Data Integration initiatives.

 The assignment of the required resources & budget allocation to manage the implementation of this Domain.


- The Developed Target Data Integration Architecture. - The Entity must attach the Target Data Integration Architecture including the following, as a minimum:

 The Data Integration requirements.

 The Data Integration Architecture diagram.

 The Architecture components.

- The Developed Data Integration Plan (Including Data Sharing Activities). - The Entity must attach the developed Data Integration Plan including the following, as a minimum:

 The summarized Architecture briefs.

 The roadmap and charters of the projects and initiatives in the plan.

 The assignment of the required resources & budget allocation for the projects and initiatives in the plan.

- The Developed Data Sharing Policies. - The Entity must attach the Policies of the Data Sharing & Interoperability (DSI) Domain including the following, as a minimum:

 The developed policies must be aligned with the National Data Governance Policies published by NDMO-SDAIA.

 Every Policy document must include the following, as a minimum:

 Policy title / name.

 Policy owner.

 Release date.

 Version number.

 Version history.

 Objective.

 Identified stakeholders.

 Communications process.

 Monitoring procedure.

 Document control (Preparation, review, approval).

 Policy Statement.

 Roles & responsibilities.

 Terminology.

 Scope of work.

 Activation mechanisms.


 Target audience.

 Approval.

 References.

- The Developed Data Sharing Training Plan. - The Entity must attach the developed and approved DSI Domain’s training and awareness plan including the following, as a minimum:

 The training and awareness programs plan implementation dates.

 The objectives of training and awareness including, as a minimum, the topics stated in the “Data Management and Personal Data Protection (DM & PDP) Standards” document, i.e.:

 Introduction to the applicability of the Data Sharing process.

 The leading / best Data Sharing practices.

 The consequences of mishandling Data.

 The Data Sharing Standards, Controls & Principles.

 The methods and channels through which the trainings will be conducted.

 Identifying the DSI Domain Awareness campaign channels, which include:

 E-mails or mobile phone messages.

 Publications.

 Lectures or workshops.

Level 3: Activated

- A Data Integration Plan Implementation Status Report. - The Entity must attach an updated and approved report on the implementation status of the DSI Plan, containing the following, as a minimum:

 The activities related to the DSI Domain and the implementation status of each activity.

- A Progress Report on Data Sharing Training Programs. - The Entity must attach an updated and approved report clarifying the DSI training & awareness implementation status containing the following, as a minimum:

 The training and awareness target audience.

 The list of conducted training programs and awareness campaigns (including the topics of the programs, the dates when they were conducted, and the names of the training attendees).

 Samples of the activities performed by the Entity to raise awareness about the DSI Domain, including:

 E-mails or mobile phone messages.

 Publications.

 Lectures or workshops.


- A Report on the Defined Roles and Responsibilities for DSI. - The Entity must attach an updated and approved report presenting the DSI Domain’s roles and responsibilities including the following, as a minimum:

 The stewardship roles of the DSI processes.

 The stewardship responsibilities of the DSI processes.

Level 4: Managed

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Data Integration Plan and Data Sharing Activities. - The Entity must attach an updated and approved report on monitoring the DSI activities based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the Data Integration Plan and Data Sharing Activities. - The Entity must attach an updated & approved report presenting the DSI Domain’s continuous improvements including the following, as a minimum:

 The conducted reviews.

 The results of each review.

 The implemented enhancements (with supporting documents).

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.2 Has the Entity defined and implemented Processes for Sharing Data within the Entity and with other Entities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of the Current DSI Practices / Processes by the Entity (Internally & Externally). - The Entity must attach a report on the current DSI Practices and Processes, both internally within the Entity and externally with other Entities, including evidence to support the practice of each Activity / Process, being one of the following, as a minimum:

 Email copies / screenshots.

 Document Copies.

 System screenshots.

Level 2: Defined

- The Process Documentation for Data Sharing (Including Data Classification Levels and Timelines). - The Entity must attach the Data Sharing Process documents with the Process details including the following, as a minimum:

 Data Sharing Request Reception.

 Identification / assignment of roles.

 Data Classification (DC) level check.

 Data Sharing Principles assessment.

 Data sharing decision and replying with feedback.

 Business Data Executive’s approval.

 Design and implementation of Data Sharing Controls.

 Data Sharing Agreement signing.

 Sharing Data with the Requestor.

- The Developed and Approved Data Sharing Request Forms (Internal and External). - The Entity must attach the Forms of the internal and external Data Sharing requests using the Entity’s approved templates.

- The Developed and Approved Internal Data Sharing Agreement Template. - The Entity must attach the internal Data Sharing Agreements between information systems within the Entity.

- The Developed and Approved External Data Sharing Agreement Template. - The Entity must attach the external Data Sharing Agreements with other Entities including the following, as a minimum:

 The Purpose of Data Sharing.


 Information about each requesting and sharing Entity.

 Lawful / Legal / Regulatory basis for Sharing.

 Sharing details (Date, Duration, etc.).

 Liability provisions.

 The Data Sharing Agreements signed by the Business Data Executive and the Requestor.
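For illustration, the minimum agreement fields listed above could be captured as a record and checked for completeness before signature; all field names and values below are hypothetical assumptions, not a mandated template:

```python
# Minimum fields an external Data Sharing Agreement record might carry (assumed names).
REQUIRED_AGREEMENT_FIELDS = (
    "purpose", "requesting_entity", "sharing_entity", "legal_basis",
    "sharing_details", "liability_provisions", "signatories",
)

# Hypothetical external Data Sharing Agreement record.
agreement = {
    "purpose": "Verification of licensing records between two entities",
    "requesting_entity": "Entity A",
    "sharing_entity": "Entity B",
    "legal_basis": "Applicable data sharing regulation (placeholder)",
    "sharing_details": {"date": "2023-11-01", "duration_months": 12},
    "liability_provisions": "As agreed between the parties",
    "signatories": ["Business Data Executive", "Requestor"],
}

def missing_agreement_fields(record):
    """Return the minimum fields absent from an agreement record."""
    return [f for f in REQUIRED_AGREEMENT_FIELDS if f not in record]
```

A completeness check of this kind could gate the signing step so that no agreement is executed with a required field left blank.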

Level 3: Activated

- Evidence of Operationalization of the Data Sharing Process, e.g., Communication of the Defined Data Sharing Mechanism. - The Entity must attach a communication operationalization report related to the defined mechanism of the Data Sharing process, including the following, as a minimum:

 Correspondence exchanged proving the establishment of a communication channel between the communicating Entities.

 Communication evidence of the approval of a formal mechanism defined for Data Sharing.

- Evidence of Data Sharing Through SDAIA's Certified and Approved Channels. - The Entity must attach a report on the electronic communication with the requestor Entity requesting Data Sharing through the SDAIA-certified and SDAIA-approved channels (e.g., the Government Service Bus (GSB), etc.). The report may contain:

 System screenshots or records (the communication channel usage can include Data Sharing).

 A report on the communications using the Entity’s formal government website.

- The Access Authorization Controls Document. - The Entity must attach a document detailing the Controls of authorization & usage for accessing the official website.

- Evidence of Data Sharing Requests Submitted to the Entity and the Requests Submitted by the Entity Through the Established Channel. - The Entity must attach the record(s) containing the Data Sharing requests:

 The list received by the Entity.

 The list sent / submitted by the Entity.

- Evidence of the Entity's Responses to the Data Sharing Requests. - The Entity must attach the notifications acknowledging receipt of the Data Sharing requests, together with the responses.


- Adoption Evidence of the Developed Data Sharing Agreement Template for an Internal Data Sharing Request. - The Entity must attach a report containing evidence of the Entity's adoption of the template developed for the Data Sharing Agreement upon an internal request from within the Entity.

- Adoption Evidence of the Developed Data Sharing Agreement Template for an External Data Sharing Request. - The Entity must attach a report containing evidence of the Entity's adoption of the template developed for the Data Sharing Agreement upon a request from an external Entity.

- The Documented Review Outcomes of the Data Sharing Agreements. - The Entity must attach a report on the results and outcomes of the documented reviews of the Data Sharing Agreements.

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the Data Sharing Processes: The Entity must attach an updated and approved report on monitoring the Data Sharing Processes based on pre-defined KPIs (Indicator Cards), covering:

 The number of Data Sharing requests received.

 The number of Data Sharing requests accepted / denied.

 The number of Data Sharing requests sent.

 The number of ongoing Data Sharing agreements.

 The average duration of the Data Sharing request evaluation process, expressed in days.

- Each indicator's data or card should include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).
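Collected as a structure, the minimum card contents above can be sketched as a simple record; the field names and the on_track helper below are illustrative assumptions, not a schema mandated by the Index.

```python
from dataclasses import dataclass, field


@dataclass
class IndicatorCard:
    """One KPI indicator card; fields mirror the minimum card contents listed above."""
    name_code: str
    owner: str
    coordinator: str
    description: str
    objective: str            # strategic / operational objective (with its Specification or Process)
    equation: str
    unit: str                 # Percentage, Number / Quantity, etc.
    baseline: float           # measurement value in the first measurement year
    target: float
    periodicity: str          # Monthly / Quarterly / Biannually / Annually
    data_sources: list = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: str = "+"       # "+": higher value is the target; "-": lower value is the target

    def on_track(self, current: float) -> bool:
        """Compare a current measurement with the target, respecting the polarity."""
        return current >= self.target if self.polarity == "+" else current <= self.target
```

Holding cards in one list makes it straightforward to generate the monitoring report required at this level from a single source of indicator definitions.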

- The Compliance Audit Methodology: The Entity must attach the Compliance Audit Methodology document.

Level 5: Pioneer

- The Continuous Improvement Report on the DSI Processes: The Entity must attach a continuous improvement report including the following, as a minimum:

 Evidence of DSI Process automation.

 The documents of the periodic reviews, evaluations & documented results of the DSI Domain Processes.

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.3 Has the Entity defined and implemented a Data integration architecture to manage the Data movement efficiently across Data stores, systems, and applications?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.

Level 1: Establishing

- Evidence of Data Movement / Integration within the Entity and with Other Entities: The Entity must attach evidence of Data Movement or Data Integration / Interoperability:

 Internally within the Entity.

 Externally with other Entities.


Level 2: Defined

- The Integration Requirements Document: The Entity must attach the Integration / Interoperability Requirements document containing the following, as a minimum:

 A clearly defined scope.

 The Entity's Business goals / objectives and aims to be achieved.

 Implementation Timeline.

 Resources required.

 Cost estimate.

 Functional requirements.

 Non-Functional requirements.

- The Solution Design Document: The Entity must attach the Solution Design document including the following, as a minimum:

 Integration Solution Overview.

 Target Data Integration Architecture.

 Data Orchestration – The Data Flow Diagram (DFD).

 Source to Target Mapping – A set of Data Transformation instructions that determine how to convert the structure and content of Data in the source system to the structure and content needed in the target system. The instructions shall include the following, as a minimum:

 The technical format of Data at the source and the target.

 The Specifications of transformations required for all intermediate staging points between the source and the target.
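A source-to-target mapping instruction of the kind described above can be sketched as follows; the field names, the CRM/DWH systems and the date-format rule are illustrative assumptions, not part of the checklist.

```python
from datetime import datetime

# One illustrative mapping entry: technical format at source and target,
# plus the transformations required at each intermediate staging point.
mapping_entry = {
    "source": {"system": "CRM", "field": "birth_date", "format": "DD/MM/YYYY string"},
    "target": {"system": "DWH", "field": "date_of_birth", "format": "ISO-8601 date"},
    "staging_transformations": [
        {"stage": "landing", "rule": "trim whitespace"},
        {"stage": "cleansing", "rule": "parse DD/MM/YYYY and reformat as YYYY-MM-DD"},
    ],
}


def apply_mapping(value: str) -> str:
    """Apply the two staging rules above to one raw source value."""
    cleaned = value.strip()                          # landing: trim whitespace
    parsed = datetime.strptime(cleaned, "%d/%m/%Y")  # cleansing: parse the source format
    return parsed.strftime("%Y-%m-%d")               # emit the target ISO-8601 format
```

Keeping the mapping as data, separate from the code that applies it, lets the same document serve both as evidence and as an executable specification.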

Level 3: Activated

- The Document of the Approved Standardized Entity-Wide Integration Solution Development Lifecycle: The Entity must attach a life cycle description document, being the standardized and approved cycle for the development of Integration / Interoperability solutions at the Entity level.

- The Developed Test Scripts and the Conducted Tests (Integration, Functional) in Line with the Plan and the Solution Design Document: The Entity must attach the developed test scripts and the conducted tests (integration tests & functional tests) in alignment with the Plan and the Solution Design document, containing the following, as a minimum:

 The defined & identified Test Use Cases.

 Evidence of a Test Environment setup.

 The Test Use Cases, executed in a Test Environment, and the documented test results.


Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the Data Integration Initiatives: The Entity must attach an updated and approved report on monitoring the Data Integration / Interoperability initiatives based on the data of the KPIs (Indicator Cards) pre-defined for the DSI Domain, covering:

 Data transfer rate between systems / applications.

 Latency between Data sources and Data targets.

- Each indicator's data or card should include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

- The Integration Solution Monitoring and Maintenance Document: The Entity must attach a document on monitoring and maintaining the Integration / Interoperability solutions, containing the following, as a minimum:

 Reports of any detected programming flaws or errors.

 Change requests from end users to incorporate changes in business requirements.

Level 5: Pioneer

- The Continuous Improvement Report on the Data Integration Practices: The Entity must attach an updated & approved report including the following for the Data Integration / Interoperability Practices, as a minimum:

 The documents of the periodic reviews & documented results.

 The continuous improvement mechanisms.


- The Continuous Improvement Mechanisms for Data Integration, e.g.: Continuous Integration & Continuous Delivery (CI/CD) Pipeline Details for Automation: The Entity must attach a document clarifying the details of the automation program (pipeline) in relation to Continuous Integration & Continuous Delivery (CI/CD).

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.4 Has the Entity developed and implemented Data Sharing Controls and Processes for efficient Data transformation and movement?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.

Level 1: Establishing

- The Existing Data Integration or Data Movement Processes within the Entity: The Entity must attach a report on the current Data Integration / Interoperability Processes or the current Data Movement Processes within the Entity.

Level 2: Defined

- The Developed Data Migration Processes (i.e.: ETL): The Entity must attach a report on the Processes and Standards used to integrate data from disparate sources and load it into the Data Warehouse / Store, i.e.: Extract, Transform, Load (ETL).

- The Developed Data Migration Processes (i.e.: ELT): The Entity must attach a report on the Processes and Standards used to store unstructured data in its raw native format in the Data Lake, i.e.: Extract, Load, Transform (ELT).

- The Risk Assessment Report on the Entity's Datasets to be Shared: The Entity must attach an assessment report on the risks which may arise as a result of sharing the Entity's Datasets.


- The Defined Data Sharing Controls: The Entity must attach a report on the Controls that have been identified and defined for Data Sharing.

Level 3: Activated

- Evidence of the Implemented Data Migration Processes (i.e.: Extract, Transform, Load (ETL)): The Entity must submit a report to showcase the implemented "Extract, Transform, Load (ETL)" process for integrating data from disparate sources and loading it into the Data Warehouse / Store. The process must include the following steps:

 Extract – Data Extraction must include the following, as a minimum:

 Identification of the Data Sources from which the Data will be extracted.

 Extracting Data from the Data Sources.

 Staging the extracted Data temporarily in a physical Data store (e.g.: on disk or in memory).

 Transform – The Data Transformation step must include the following, as a minimum:

 Data Mapping – Planning the actual transformation process.

 Data Transformation – Removing duplicate Data, filling in missing values, filtering, sorting, joining and splitting Data.

 Review – Validating the correctness of the transformation.

 Load – Physically storing the transformed Data in the Data Warehouse / Store.
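The Extract, Transform and Load steps above can be sketched as follows; the in-memory sources, staging list and warehouse dictionary are stand-ins for real systems, and all names are illustrative assumptions.

```python
def extract(sources):
    """Extract: identify the Data Sources, pull their records, stage them temporarily."""
    staging = []                          # temporary staging-area stand-in
    for _name, records in sources.items():
        staging.extend(records)
    return staging


def transform(staging):
    """Transform: map, deduplicate, fill in missing values, sort; then Review."""
    seen, rows = set(), []
    for rec in staging:
        key = rec["id"]
        if key in seen:                   # remove duplicate Data
            continue
        seen.add(key)
        rows.append({"id": key, "name": rec.get("name") or "UNKNOWN"})  # fill missing values
    rows.sort(key=lambda r: r["id"])      # sort
    assert all(r["name"] for r in rows)   # Review: validate the transformation
    return rows


def load(rows, warehouse):
    """Load: physically store the transformed Data in the warehouse store."""
    for r in rows:
        warehouse[r["id"]] = r
    return warehouse
```

A run would chain the three steps: `load(transform(extract(sources)), warehouse)`.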

- Evidence of the Implemented Data Migration Processes (i.e.: Extract, Load, Transform (ELT)): The Entity must submit a report to showcase the implemented "Extract, Load, Transform (ELT)" process for storing unstructured data in its raw native format in the Data Lake. The process must include the following steps:

 Extract – The Data Extraction step must include the following, as a minimum:

 Identification of the Data Sources from which the Data will be extracted.

 Extracting Data from the Data Sources.

 Load – Physically storing Data in its raw native format in the Data Lake.

 Transform – The Data Transformation step must include the following, as a minimum:

 Data Mapping – Planning the actual transformation process.

 Data Transformation – Removing duplicate Data, filling in missing values, filtering, sorting, joining and splitting Data.

 Review – Validating the correctness of the transformation.
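In contrast with an ETL flow, the ELT steps above load the raw Data into the lake unmodified and transform it afterwards, on read; the in-memory lake dictionary and the JSON serialization are illustrative assumptions.

```python
import json


def extract_and_load(lake, source_name, raw_records):
    """Extract from a source and load it, unmodified and in native form, into the lake."""
    lake[source_name] = json.dumps(raw_records)   # raw native format (JSON here)
    return lake


def transform_on_read(lake, source_name):
    """Transform after loading: map, deduplicate and validate (Review)."""
    records = json.loads(lake[source_name])
    unique = {rec["id"]: rec for rec in records}  # remove duplicate Data
    result = sorted(unique.values(), key=lambda r: r["id"])
    assert all("id" in r for r in result)         # Review: validate the transformation
    return result
```

Deferring the transformation keeps the lake a faithful copy of the sources, so later transformations can be rerun against the same raw Data.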

- A Report on the Implemented Controls (e.g.: Data Security & Protection, Data Sharing, Data Integration, Data Access, etc.): The Entity must attach a report clarifying the Controls implemented on the Migration Processes (ETL or ELT) (e.g.: Controls for Data Sharing, Data Integration / Interoperability, Data Access Authorizations, etc.), including the following, as a minimum:

 Evidence that all stakeholders involved in Data Sharing applied the appropriate Security Controls, as issued by the National Cybersecurity Authority (NCA), to protect and share Data in a safe and reliable environment, in alignment with the relevant laws and regulations.

 Evidence that all stakeholders involved in Data Sharing have the following, as a minimum:

 The authorization to view, obtain / acquire and use this Data (which may require security scanning depending on the nature and sensitivity of the Data, based on the Data Classification (DC) Domain Standards).

 Knowledgeable, skilled and qualified people who are properly trained in handling the Shared Data.

 Evidence that all stakeholders involved in Data Sharing applied the controls necessary to appropriately manage and protect the Shared Data, including what is stated in the National Data Governance (DG) Policies issued by SDAIA, e.g.:

 Legal / Regulatory basis.

 Delegation / Authorization.

 Data type.

 Data pre-processing.

 Data Sharing methods.

 Data usage & maintenance.

 Data Sharing duration.

 The number of times Data was Shared.

 Sharing cancellation.

 Liability provisions.

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the DSI Controls, Processes and Standards: The Entity must attach an updated and approved report on monitoring the DSI Controls, Processes and Standards based on pre-defined KPIs (Indicator Cards), and each indicator's data or card should include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on Reviewing the DSI Processes and Mechanisms to Automate the DSI Controls and Practices: The Entity must attach an updated and approved report on reviewing the processes and mechanisms implemented to automate the DSI Domain's Controls, Standards, and Practices, including the following, as a minimum:

 The documents of the periodic reviews & documented results.

 The continuous improvement mechanisms.

8.2.8. Reference and Master Data Management Domain

Checklist – Reference and Master Data Management Domain

RMD.MQ.1 Has the Entity developed and implemented a plan focused on improving its Reference & Master Data (RMD) Management capabilities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.

Level 1: Establishing

- The Currently Existing Practices Related to the RMD Management Domain: A report must be attached clarifying the current practices related to Reference Data management and/or Master Data management.

Level 2: Defined

- The Developed and Approved RMD Management Plan: An RMD Management Plan must be attached including the following, as a minimum:

 A roadmap that includes the activities and milestones for the Entity's RMD Management. The activities shall incorporate what is needed to achieve this Domain's specifications, as a minimum.

 The assignment of the required resources & budget allocation to manage the implementation of the roadmap activities.

Level 3: Activated

- The RMD Management Plan Implementation Status Report: A report must be attached clarifying the implementation status including the following, as a minimum:

 The achievement percentages of the initiatives and projects included in the executive action plan of RMD management.

- The RMD Management Training Implementation Status Report: A report must be attached clarifying the implementation status including the following, as a minimum:

 Samples of the activities performed by the Entity to raise awareness in the RMD Management Domain (awareness messages, publications, lectures or workshops).

 A sample attendance certificate of training the Entity's employees in the RMD Management Domain.

- The RMD Management Operating Model Showing the RMD Stewardship Coverage: A document must be attached clarifying the RMD Management Operating Model including the following, as a minimum:

 Assigning Business Data Stewards.

 Assigning IT Data Stewards.


- The RMD Change Request Logs: A consolidated log must be attached containing the following, as a minimum:

 The RMD change requests.

 The decisions made on the RMD change requests.

- The RMD Management Documents & Artifacts: The planning documents & supporting artifacts of the RMD Management initiatives must be attached including the following, as a minimum:

 The register of the Architecture status of all RMD initiatives (e.g.: The Statement of Architecture Work (Scope) document).

Level 4: Managed

- The Monitoring Report of the RMD Management Plan Implementation with Pre-defined KPIs: A monitoring report must be attached, prepared based on the data of the KPIs (Indicator Cards) pre-defined in the RMD Management Plan. The data of each indicator or each indicator card must include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the RMD Management Plan: A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Management Plan including the following, as a minimum:

 The documents of the periodic reviews & documented results of RMD Management.

Checklist – Reference and Master Data Management Domain

RMD.MQ.2 Has the Entity defined and implemented processes to manage its Reference & Master Data (RMD) objects from creation to archival?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.

Level 1: Establishing

- Evidence of Projects / Initiatives with Reference Data and/or Master Data Identified: Evidence must be attached proving the implementation of any existing project or current initiative that involves identified Reference Data and/or Master Data Objects.

Level 2: Defined

- The Identified, Prioritized and Categorized RMD Objects: Documents must be attached clarifying that the Entity identified, prioritized & categorized the RMD including the following, as a minimum:

 Identifying & Prioritizing the RMD:

 The identified Master Data objects (internal & external).

 The identified Reference Data objects (internal & external).

 The identified Data sources and applications that generate, read, update, and delete RMD Objects.

 The logically grouped & prioritized RMD Objects.

- Reference Data Categorization: Documents must be attached showing that the Entity has categorized the Reference Data. These documents must include the following, as a minimum:

 Internal Reference Data.

 External Reference Data.

- Master Data Categorization: Documents must be attached showing that the Entity has categorized the Master Data. These documents must include the following, as a minimum:

 Internal Master Data.

 External Master Data.


- The Reference & Master Data Requirements: Evidence documents must be attached clarifying that the Entity collected the RMD requirements including the following, as a minimum:

 RMD Management roles across the Data lifecycle, from creation to archiving.

 Rules for the accurate matching and merging of Master Data Records from different Data sources to create Golden Records.

 Requirements for provisioning Master Data Golden Records to consuming applications.

 Requirements for provisioning Reference Data Objects to consuming applications.

 Data Quality (DQ) requirements for RMD Objects, to be leveraged as input for the Initial Data Quality Assessment detailed in the Data Quality (DQ) Domain.
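A matching-and-merging rule for producing Golden Records, in the spirit of the requirement above, can be sketched as follows; the match key (a national ID) and the survivorship rule (the most recently updated non-empty value wins) are illustrative assumptions, not rules mandated by the Index.

```python
def build_golden_records(records):
    """Group source records by a match key and merge each group into one Golden Record."""
    groups = {}
    for rec in records:
        groups.setdefault(rec["national_id"], []).append(rec)   # matching rule: same national ID
    golden = {}
    for key, group in groups.items():
        group.sort(key=lambda r: r["updated"])                  # oldest first
        merged = {}
        for rec in group:                                       # survivorship rule:
            for field_name, value in rec.items():               # the newest non-empty value wins
                if value not in (None, ""):
                    merged[field_name] = value
        golden[key] = merged
    return golden
```

In a real MDM platform both rules would be configured, not hard-coded, but the evidence document can describe them in exactly this group-sort-merge shape.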

- The Defined SLAs for RMD Lifecycle Management: A document should be attached clarifying the RMD lifecycle management process SLAs including, as a minimum:

 The RMD request implementation time frame.

 The escalation procedures to be followed upon SLA violation.

Level 3: Activated

- The RMD Lifecycle Management Process: A document must be attached clarifying a process for managing RMD Objects across the Data Lifecycle from creation to archival / disposal. The process must cover the roles and procedural actions involved in the following Data Lifecycle steps, as a minimum:

 Creating new RMD Objects & Instances.

 Modifying existing RMD Objects & Instances.

 Archiving RMD Objects & Instances.

- Evidence of the Implementation & Adoption of the National Reference Datasets:

 An attachment of the reference datasets shared with SDAIA.

 An attachment that proves the processes for transferring the reference data.

Level 4: Managed

- The Monitoring Report of the RMD Management Processes with Pre-defined KPIs & SLAs: A monitoring report must be attached, prepared based on KPIs (Indicator Cards) pre-defined for the RMD Management Processes and based on the SLAs, including measuring the following indicators, as a minimum:

 The number of incorrect Data values in the Master Data Records.

 The Mean Time to Repair (MTTR) RMD quality issues.

 The volume of change requests submitted to modify RMD Objects.


- Each indicator's data or card should include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the RMD Management Processes & Standards: A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Management Processes including the following, as a minimum:

 The documents of the periodic reviews & documented results of the RMD Management Processes.

Checklist – Reference and Master Data Management Domain

RMD.MQ.3 Has the Entity implemented a Data Hub (or Tool) as the trusted Data source to support the RMD Management Processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.


Level 1: Establishing

- Evidence of Reference & Master Data (RMD) Used for Project Purposes: Evidence must be attached of existing Reference Data and/or Master Data (tool sample: Excel) used for the purposes of a specific, current project.

Level 2: Defined

- The Target RMD Management Architecture Design: A document must be attached illustrating that the Entity designed an RMD Hub to efficiently manage the RMD Objects including, as a minimum:

 Identifying a Data Hub architecture implementation pattern to manage the Master Data Objects.

 A design that supports the centralized management of Reference Data – the Hub is the single source of Reference Data, with creation and modification operations performed exclusively within the Hub.

- The Developed RMD Conceptual Architecture: The target RMD environment Conceptual Architecture must be attached, as per the Entity's selected Data Hub architecture design, indicating the foundational building block components and the high-level capabilities associated with the components. The conceptual RMD architecture must consist of the following, as a minimum:

 Architecture Description – A description of the overall architecture concept defined.

 Components Definitions and Descriptions – The building blocks (the Data Hub, Data sources, consuming applications, etc.) of the RMD Conceptual Architecture, with descriptions of their purposes.

 The General Conceptual Architecture Diagram – A high-level view of how the components work together to address the Entity's RMD requirements.

- The Developed RMD Information Architecture: A document must be attached clarifying an Information Architecture for the target RMD environment based on the defined Conceptual Architecture. The Information Architecture must represent the following components, as a minimum:

 RMD Objects – An inventory of the identified RMD Objects, including Metadata definitions.

 Conceptual & Logical Master Data Model – A conceptual & logical Data model for the identified Master Data Objects and their relationships.

 RMD Sources – An inventory of the identified RMD sources.

 Rules for Matching & Merging Master Data Records from different Data sources to create Golden Records.

 RMD Flows.


- The RMD Hub / Tool Technical Requirements: A document must be attached defining, identifying & documenting the technical requirements for the RMD Hub platform based on the defined target RMD Information Architecture. The requirements must cover the following areas, as a minimum:

 Management of Workflows:

 Creating & modifying RMD Records.

 Assigning Master Data Management (MDM) Stewardship.

 Versioning Control – Tracking changes in RMD Records over time.

 Functional Capabilities – Detailing the functional capabilities required from the Hub (e.g.: import, export, Data mappings, automation of operational tasks around collection, cleansing, etc.).

 Technical Capabilities – Detailing the technical capabilities required from the Hub (e.g.: API integration with upstream & downstream applications and systems).

 Security – Supporting secure Data exchange between the Hub and the connected applications / Data sources.

Level 3: Activated

- The Implemented RMD Management Hub: A document must be attached providing evidence that the Entity implemented an integrated Hub for managing its RMD Objects, to be considered the trusted source of RMD across the entire Entity. The document must provide evidence for the following performed tasks, as a minimum:

 The instantiated physical Data Hub Technical Architecture components necessary to address the Entity's target RMD Information Architecture requirements.

 The Master Data Model, as defined by the RMD Information Architecture, established within the Data Hub.

 The RMD Objects loaded into the Data Hub.

 The necessary replication activated between the Master Data Source systems and the Data Hub.

 Synchronization activated between the Data Hub and the consuming applications.

- The Workflow Documentation Showing the Establishment of the Data Hub as the Entity's Trusted Source: Evidence must be attached proving that the Data Hub is considered the reliable source for each information system or new application that requires the use of specific RMD Objects (for each program).


Level 4: Managed

- The Monitoring Report of the RMD Management Hub / Tool Capabilities with Pre-defined Key Performance Indicators (KPIs): A monitoring report must be attached, prepared based on KPIs (Indicator Cards) pre-defined to monitor the RMD Management Hub / Tool capabilities, and each indicator's data or card should include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the RMD Information Architecture and the Implemented Data Hub: A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Hub & Architecture including the following, as a minimum:

 The documents of the periodic reviews & documented results of the RMD Hub, including the Information Architecture.

8.2.9. Business Intelligence and Analytics Domain

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.1 Has the Entity developed and implemented a plan to manage and orchestrate its Business Intelligence & Analytics (BIA) activities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable.

Level 1: Establishing

- Current Activities Related to BIA: A document must be attached, covering:

 Current activities related to BIA (reports and/or dashboards, etc.).

Level 2: Defined

- The Defined & Approved BIA Plan: An approved BIA plan must be attached, and the plan must include the following, as a minimum:

 A roadmap of activities and milestones for the Data Analytics & Artificial Intelligence (DA & AI) use cases. The activities should include what is necessary to achieve this Domain's specifications, as a minimum.

 The allocation of the required resources & budget to manage the implementation of the DA & AI use cases.

Level 3: Activated
- Acceptance Evidence: The BIA plan & roadmap implementation status report.
- Acceptance Criteria: A status report of the BIA Plan & roadmap implementation must be attached, which includes, as a minimum:
  • The achievement percentages of the initiatives & projects included in the plan & roadmap.

- Acceptance Evidence: The Defined & Documented Roles & Responsibilities for BIA Activities, Including Data Stewardship Roles.
- Acceptance Criteria: A document must be attached showing the roles & responsibilities of the Data team members working on the BIA activities’ implementation, including, as a minimum:
  • The defined & documented roles & authorizations of all data specialists, including the Data Stewards & Data Owners.

Level 4: Managed
- Acceptance Evidence: The Effectiveness Monitoring Report of the BIA plan & activities with pre-defined Key Performance Indicators (KPIs).
- Acceptance Criteria: A monitoring report must be attached showing the effectiveness of the BIA plan & activities, prepared based on pre-defined KPIs. Each indicator’s data or card should include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).
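Taken together, these fields form a small structured record. The Python sketch below models one possible indicator card; the class, field names, and the `on_track` helper are our own illustration under these assumptions, not a structure prescribed by the NDI:

```python
from dataclasses import dataclass, field
from enum import Enum

class Polarity(Enum):
    POSITIVE = "+"   # a higher indicator value is the target
    NEGATIVE = "-"   # a lower indicator value is the target

class Periodicity(Enum):
    MONTHLY = "Monthly"
    QUARTERLY = "Quarterly"
    BIANNUALLY = "Biannually"
    ANNUALLY = "Annually"

@dataclass
class IndicatorCard:
    code: str                  # Indicator's Name / Code
    owner: str                 # Indicator's Owner
    coordinator: str           # Indicator's Coordinator
    description: str           # Indicator's Description
    objective: str             # strategic / operational objective measured
    equation: str              # Indicator's Equation, e.g. "achieved / planned * 100"
    unit: str                  # Measurement Unit (Percentage, Number / Quantity, ...)
    baseline: float            # measurement value in the first measurement year
    target: float              # Target value
    periodicity: Periodicity   # Measurement Periodicity
    data_sources: list[str] = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: Polarity = Polarity.POSITIVE

    def on_track(self, current: float) -> bool:
        """True when the current measurement meets or beats the target,
        taking the indicator's polarity into account."""
        if self.polarity is Polarity.POSITIVE:
            return current >= self.target
        return current <= self.target
```

Recording polarity on the card is what makes a generic on-track check possible: the same comparison logic serves both "higher is better" and "lower is better" indicators.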

Level 5: Pioneer
- Acceptance Evidence: Regular reviews and improvements to the business intelligence and analytics plan and roadmap.
- Acceptance Criteria: A report on the BIA plan & roadmap must be attached, including the following, as a minimum:
  • Documented periodic reviews and results, including the updated plan and updated roadmap.
  • BIA continuous improvement mechanisms.

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.2 Has the Entity identified BIA use cases and defined a plan for the use case implementation?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: List of the implemented BIA use cases within the Entity.
- Acceptance Criteria: The document must include a list of the use cases implemented within the Entity, and for each use case in the list, the following must be documented, as a minimum:
  • The name of the use case.
  • The description of the use case.
  • The desired & the achieved objectives of the use case.

Level 2: Defined
- Acceptance Evidence: The approved BIA business cases / BIA use cases.
- Acceptance Criteria: A document containing the following must be attached:
  • The list of approved BIA Business Cases.

  • The list of approved BIA use cases.
- For each business case / use case in the identified lists, the following must be documented, as a minimum:
  • The name of the case.
  • The description of the case.
  • The stakeholders of the case.

- Acceptance Evidence: The approved BIA Use Cases Prioritization Framework.
- Acceptance Criteria: A document must be attached that includes the approved framework for shortlisting / prioritizing BIA use cases, and the document must include the following, as a minimum:
  • Prioritization Identification Criteria: a list of criteria used to prioritize use cases, such as strategic alignment, potential impact, complexity, and resource requirements.
  • Weighting Criteria: an explanation of how each criterion is weighted and how the weighting is specified. This can be based on expert opinion, stakeholder input, a data-driven approach, etc.
  • Scoring Mechanism: a description of how scores are calculated to weight & specify use case priorities.
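One plausible reading of the criteria, weighting, and scoring items above is a simple weighted-sum model. In the sketch below, the criterion names, weights, and the 1-to-5 scoring scale are illustrative assumptions of ours, not NDI requirements:

```python
# Hypothetical weighted-scoring sketch for BIA use case prioritization.
# Weights must sum to 1; each criterion is scored on a 1-5 scale.
CRITERIA_WEIGHTS = {
    "strategic_alignment": 0.40,
    "potential_impact":    0.30,
    "complexity":          0.15,  # scored so that LOWER complexity earns a HIGHER score
    "resource_fit":        0.15,
}

def priority_score(scores: dict[str, float]) -> float:
    """Weighted sum of per-criterion scores for one use case."""
    assert abs(sum(CRITERIA_WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

def shortlist(use_cases: dict[str, dict[str, float]], top_n: int = 3) -> list[str]:
    """Return use case names ranked by descending priority score."""
    ranked = sorted(use_cases, key=lambda name: priority_score(use_cases[name]),
                    reverse=True)
    return ranked[:top_n]
```

In a real framework the weights themselves would come from the weighting step the checklist describes (expert opinion, stakeholder input, or a data-driven approach) rather than being fixed constants.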

- Acceptance Evidence: The shortlisted / prioritized use cases based on business needs.
- Acceptance Criteria: A document must be attached listing the high-priority use cases based on the pre-approved prioritization framework. For each use case in the prioritization list, the following must be documented, as a minimum:
  • The name of the use case.
  • The description of the case.
  • The priority of the case.
- Note: if a single document contains the approved use cases, the approved framework, and/or the prioritized use cases, that evidence can be attached to this requirement and the previous two requirements.

- Acceptance Evidence: The BIA Use Case Portfolio document with details of each use case.
- Acceptance Criteria: A portfolio document of BIA use cases must be attached, and the following details must be documented for each use case, as a minimum:
  • The desired & achieved objectives of each use case.
  • The type of Analytics leveraged (among the five analysis-maturity levels: Discovery, Descriptive, Diagnostic, Predictive, Prescriptive).
  • The expected benefits & business value aimed to be derived (Return on Investment (ROI)) through the development of each business case.


  • Stakeholders & Entities involved in the implementation of the use case, the responsible ‘owner’ who would lead the use case, and the target consumers who would benefit from the insights generated by the use case.
  • A list of business requirements needed to implement the use case.
  • Data sources that would feed the use case, with the required data fields.
  • The technologies required to implement the use cases.
  • Case priority.

- Acceptance Evidence: The Approved Use Case implementation plan.
- Acceptance Criteria: An approved BIA use case implementation plan must be attached, including:
  • The implementation plan for each shortlisted / prioritized & approved use case, with the succession order of the implementation steps, from the use case trial / piloting phase, through the production phase, to continuous result monitoring. The implementation plan shall address the following, as a minimum:
    - Detailed Functional & Non-Functional Requirements: use case objectives translated into analytics requirements.
    - High-Level Design: conceptual design of the analytics solution (e.g., wireframes).
    - Staging & Production Environment Preparations: analytics solution hosting environments during and after development.
    - Development: functional & non-functional requirements to be developed to meet the high-level design.
    - Testing: scopes & types of testing that must be conducted.
    - Deployment & Schedule: timeline for establishing a pilot and/or delivering the complete use case.
    - Required Resources: the Entity’s key personnel who have the needed skills, expertise & knowledge to successfully implement the Data Analytics use case.
    - Acceptance Criteria: key criteria for measuring the successful implementation of the Data Analytics use case.

- Acceptance Evidence: The defined use case implementation approach (e.g., DevOps, Agile, etc.).
- Acceptance Criteria: A use case implementation methodology document must be attached, including the following, as a minimum:
  • The approach / method’s general description.
  • Key practices & tools in the approach.
  • The roles & responsibilities of the participants in the approach.

- Acceptance Evidence: The approved use case validation process.
- Acceptance Criteria: A validation process document must be attached showing how to validate use case outcomes, stating the initial intended purpose, and explaining the alignment with the Entity’s overall Data Analytics Plan. The validation of use cases shall address the following, as a minimum:
  • Analytics use case functional and non-functional requirements.
  • Analytics use case Personal Data Protection (PDP) considerations, as prescribed in the Privacy / PDP Domain.
  • Data Analytics use case Return on Investment (ROI) for each target set.

Level 3: Activated
- Acceptance Evidence: The implemented and new use cases.
- Acceptance Criteria: A document must be provided that includes a list of implemented use cases, as well as a list of new use cases that have been evaluated and implemented according to business requirements. For each use case in the specified lists, the following must be documented, as a minimum:
  • The name of the use case.
  • The description of the case.
  • The priority of the case.

- Acceptance Evidence: The up-to-date BIA use case register.
- Acceptance Criteria: An up-to-date register of use cases must be attached showing the following, as a minimum:
  • The register’s releases (with dates).
  • BIA use cases.
  • The results of the final reviews of each use case implementation.
  • The stakeholders in the Entity who can view the register.

- Acceptance Evidence: The Outcomes of the BIA Use Case Validation Processes.
- Acceptance Criteria: A document must be attached showing the outcomes of the BIA use case validation activities. It must include the outcome of each use case validation process, stating the initial intended purpose and explaining the alignment with the Entity’s overall Data Analytics Plan. The use case validation outcomes document must include the following, as a minimum:
  • Analytics use case functional and non-functional requirements.
  • Analytics use case Personal Data Protection (PDP) considerations, as prescribed in the Privacy / PDP Domain.
  • Data Analytics use case Return on Investment (ROI) for each target set.

Level 4: Managed
- Acceptance Evidence: The Monitoring Report on the BIA Portfolio Effectiveness with pre-defined Key Performance Indicators (KPIs).
- Acceptance Criteria: A monitoring report on the effectiveness of the BIA portfolio must be attached, including the following KPIs, as a minimum:
  • The number of identified / shortlisted use cases.
  • The number of use cases in the pilot / beta phase for testing.
  • The number of implemented use cases that are widely utilized.


  • The total Return on Investment (ROI) value generated from use case implementations.
  • Other potential Key Performance Indicators (KPIs) include (but are not limited to):
    - Accuracy in achieving desired outcomes from analytical models.
    - The number of positive impacts of implemented use cases on business aspects, such as process improvement or enhanced user experience.
    - The number of positive impacts of implemented use cases on technical aspects, such as increased adherence to technical standards.
    - The number of positive impacts of implemented use cases on strategic decisions.

- Each indicator’s data or card must include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: Continuous Improvement Report on BIA Use Cases.
- Acceptance Criteria: An updated report on the continuous improvement of the BIA use cases must be attached, including the following for each use case, as a minimum:
  • Potential Impact.
  • Competitive Advantage.
  • Total Cost of Ownership (TCO) Analysis:


    - Analysis of all expenses over the life of the project or per task.
    - A description of the updated TCO calculation methodology, with any assumptions or estimations applied during the calculation.
  • Return on Investment (ROI) Analysis:
    - Analysis of all benefits achieved, such as revenue increases, cost savings, or productivity gains, in comparison with the expenses per project or per task.
    - A description of the updated ROI calculation methodology, with any assumptions or estimations applied during the calculation.
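To illustrate how TCO and ROI figures relate, the sketch below applies one common convention (ROI as net benefit over TCO, expressed as a percentage). The cost and benefit line items are invented, and the NDI deliberately leaves the exact calculation methodology to each Entity, so treat this as one possible convention rather than the prescribed one:

```python
# Illustrative TCO / ROI calculation for a single BIA use case.

def total_cost_of_ownership(costs: dict[str, float]) -> float:
    """TCO = sum of all expenses over the life of the project (or per task)."""
    return sum(costs.values())

def return_on_investment(benefits: dict[str, float], costs: dict[str, float]) -> float:
    """ROI as a percentage of TCO: (total benefits - TCO) / TCO * 100."""
    tco = total_cost_of_ownership(costs)
    total_benefit = sum(benefits.values())  # revenue increases, cost savings, productivity gains
    return (total_benefit - tco) / tco * 100.0

# Invented figures for one hypothetical use case (in SAR):
costs = {"licences": 120_000.0, "infrastructure": 60_000.0, "staffing": 220_000.0}
benefits = {"cost_savings": 300_000.0, "productivity_gains": 180_000.0}
# TCO = 400,000; total benefits = 480,000; ROI = (480,000 - 400,000) / 400,000 * 100 = 20 %
```

Whichever convention an Entity adopts, the checklist's real requirement is that the methodology and its assumptions are documented alongside the numbers.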

- Acceptance Evidence: The Updated BIA Use Case Register and the Optimized Use Cases.
- Acceptance Criteria: An updated use case register must be attached, including the following, as a minimum:
  • The revised & improved use cases.

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.3 Has the Entity defined and implemented practices to manage and govern the BIA processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: The Existing BIA Domain Processes & Governance Documentation.
- Acceptance Criteria: A document must be attached covering the following, as a minimum:
  • The details of the current BIA processes.
  • The current BIA governance documents.

- Acceptance Evidence: The List of Existing Reports & Dashboards.
- Acceptance Criteria: A document must be attached covering the following, as a minimum:
  • The current reports.
  • The current dashboards.


Level 2: Defined
- Acceptance Evidence: The Developed & Approved Processes of BIA Management.
- Acceptance Criteria: A document containing details of the developed and approved processes for managing business intelligence and analytics must be attached, which should include, at a minimum:
  • Data warehouse or data lake processes with logical models for modeling business functions.

- Acceptance Evidence: The Approved Process of New Data Source Requirements.
- Acceptance Criteria: A document must be attached containing the approved procedures for identifying, evaluating, and approving new sources of data to be used in BIA applications. The document must include, as a minimum:
  • Data Source Identification: identifying new sources relevant to the Entity’s BIA needs, e.g., external data sources or internal data sources from units within the Entity.
  • Data Source Evaluation: evaluating proposed new sources based on factors such as Data Quality (DQ), alignment with business objectives, and cost-benefit considerations.
  • Data Source Prioritization: giving priority to proposed new sources based on the importance of their data and their impact on the Entity’s BIA activities.
  • Data Source Approval / Agreement: obtaining the necessary stakeholder approvals, e.g., Data Owners, IT teams & business leaders.


- Acceptance Evidence: The Approved Demand Management Process.
- Acceptance Criteria: The approved process document for BIA Demand Management must be attached, showing the following:
  • Demand Analysis: the process of analyzing BIA service demand, including request volume, frequency, and complexity.
  • Request Prioritization: the process of prioritizing BIA service requests, including the criteria that will be used to prioritize each request.
  • Resource Allocation: the process of allocating resources to support the BIA service demand.
  • Communication: the process of communicating the BIA service demand status, and the performance of the Demand Management process, to stakeholders.
  • Governance & Compliance: the governance & compliance process for implementing this approved process for BIA service Demand Management.

- Acceptance Evidence: Development and maintenance document of the Semantic Layer.
- Acceptance Criteria: The document for the development and maintenance of the semantic layer in the business intelligence and analytics solution must be attached. The semantic layer acts as a bridge between data sources and end users by providing a logical representation of the data to facilitate understanding and usage. The document should include, at a minimum:
  • Data Source Mapping: the mapping between data sources and the semantic layer, i.e., how the data from each source is mapped onto the semantic layer, including any transformations or collations performed.
  • Business Logic: the business logic applied to the data in the semantic layer. This shall include Entity-specific definitions of key business concepts.
  • Security: how security is implemented in the semantic layer. It shall define the security roles & permissions that apply to the data.
  • Performance Optimization: the strategies used to enhance the semantic layer’s performance, including techniques such as indexing / cataloging, caching / temporary storage, and clustering to improve query performance.
  • Metadata: the metadata associated with the semantic layer, including data descriptions and definitions of business concepts.


  • User Interface: the components of the user interface used to access and interact with the semantic layer, e.g., dashboards, reports, and data discovery / exploration tools.
  • Records and results of any maintenance performed on the semantic layer.
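A semantic-layer entry essentially records, per business measure, its source mapping, business logic, and security roles. The dictionary below is a minimal hypothetical sketch of two such entries; all source names, measures, transformations, and roles are invented for illustration:

```python
# Hypothetical semantic-layer catalog: business-friendly measure names mapped
# to physical sources, with transformation logic and role-based visibility.
SEMANTIC_LAYER = {
    "Monthly Active Users": {
        "source": "crm.user_events",          # physical data source (invented)
        "column": "user_id",
        "transformation": "COUNT(DISTINCT user_id) per calendar month",
        "definition": "Unique users with at least one event in the month",
        "allowed_roles": ["analyst", "executive"],   # security roles (invented)
    },
    "Net Revenue": {
        "source": "erp.invoices",
        "column": "amount",
        "transformation": "SUM(amount) - SUM(refunds.amount)",
        "definition": "Invoiced revenue net of refunds",
        "allowed_roles": ["finance"],
    },
}

def visible_measures(role: str) -> list[str]:
    """Measures a given security role may query through the semantic layer."""
    return [name for name, spec in SEMANTIC_LAYER.items()
            if role in spec["allowed_roles"]]
```

Keeping the mapping, the Entity-specific definition, and the allowed roles in one record is what lets the same catalog serve both the Business Logic and the Security sections of the required document.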

- Acceptance Evidence: Advanced analytics management and governance process.
- Acceptance Criteria: The document for the management and governance of advanced analytics must be attached. The document must include, as a minimum:
  • The Advanced Analytics Governance Framework.
  • The operations & procedures of Data Management in the advanced analytics context.
  • The operations & procedures for managing the development, deployment, and maintenance of the models used in advanced analytics.
  • The security & compliance procedures in the context of advanced analytics.
  • The operational procedures of advanced analytics.
  • The continuous improvement operations & procedures of advanced analytics. This shall include procedures for measuring advanced analytics effectiveness to identify improvement areas and implement the modifications.

- Acceptance Evidence: The Developed & Approved Change Management Plan including the training programs.
- Acceptance Criteria: The developed & approved Change Management Plan (for training programs) must be attached, including the following, as a minimum:
  • The details of the BIA training to be conducted for all employees participating in BIA initiatives, to raise the level of analytical capabilities within the Entity. The training shall include the following, as a minimum:
    - Methods of collecting & organizing the data required for the analysis.
    - Model development, analysis method implementation, and analytical tool usage.
    - The development of data models and data flow paths.
    - The types of graphic representations of data & information.
    - The analysis model evaluation methods.

- Acceptance Evidence: The Developed & Approved Change Management Plan including awareness campaigns.
- Acceptance Criteria: The developed & approved Change Management Plan (for awareness campaigns) must be attached, including the following, as a minimum:
  • The details of the BIA Domain’s awareness campaigns to enhance familiarity with, knowledge of, and usage of the Data Analysis (DA) & Artificial Intelligence (AI) capabilities. The Entity shall use one or more of its own communication channels for awareness campaigns covering the following:
    - The analytics assets that the Entity currently owns: previously implemented use cases, analysis models, Application Programming Interfaces (APIs), BI reports, and the monitoring dashboards utilized; all of the above for probable data sharing and potential reuse.
    - BIA success stories: quantitative & qualitative benefits and outcomes from recent use case implementation activities.
    - The new tools used in the BIA Domain and the technical tool workflows in the Entity, especially those tools based on emerging new technologies.

- Acceptance Evidence: The identified & developed Data Sources & Data Marts.
- Acceptance Criteria: A document should be attached that includes, at a minimum:
  • Data Sources: e.g., systems, databases, or applications that create, store, or manage data that will be used in BIA. They can include internal data sources, such as transactional databases, data storage, and data lakes, in addition to external data sources, such as social media channels, market research reports, and publicly available datasets.
  • Data Marts: subsets / partitions of data warehouses designed to serve particular business units or specific job functions.

Level 3: Activated
- Acceptance Evidence: Evidence of the adoption & implementation of business intelligence and analytics management and governance.
- Acceptance Criteria: A document must be attached including the following:
  • Proof of adopting and implementing all business intelligence and analytics management and governance processes.
  • Proof of adopting and implementing comprehensive governance that includes emerging topics such as the ethics of artificial intelligence.
  • Proof of adopting and implementing advanced analytics management and governance processes (AI/ML Ops).

- Acceptance Evidence: The approved Operating Model with defined roles & responsibilities for the Data Science team.
- Acceptance Criteria: A document must be attached showing the following:
  • The approved Operating Model of the Data Science team.
  • Process flow documentation.
  • The roles & responsibilities of the Data Science team, which shall include, as a minimum:
    - Data Scientists.
    - Data Engineers.
    - Visualization Engineers.


- Acceptance Evidence: Evidence of the training courses & awareness campaigns conducted.
- Acceptance Criteria: A document must be attached showing the following:
  • Proof of all conducted training courses.
  • Proof of all conducted awareness campaigns.

- Acceptance Evidence: The approved User Acceptance Test (UAT) documents.
- Acceptance Criteria: A document must be attached showing the following, as a minimum:
  • The approved plan(s) to conduct UATs.
  • The approved report(s) on UATs.

- Acceptance Evidence: The approved outcomes, i.e., the reports & dashboards produced for the business units.
- Acceptance Criteria: A document must be attached showing the following:
  • The reports generated & approved for each business unit.
  • The dashboards generated & approved for each business unit.
  • Proof of fully activating self-service analytics, covering both business intelligence and artificial intelligence / machine learning (AI/ML).

- Acceptance Evidence: The Capacity Planning document.
- Acceptance Criteria: The Capacity Planning document must be attached showing the following, as a minimum:
  • Current Status Analysis: analyzing the current capacity of BIA activities.
  • Future Status Analysis: analyzing the expected demand for BIA services.
  • Requirements: detailing the specific resources required to satisfy the expected BIA service demand, including hardware devices, software, and employees.
  • Capacity Gaps: identifying the gaps between the current & future capacities / capabilities, and defining strategies to fill the gaps.
  • Resource Allocation: documenting the process of allocating resources to support the Entity’s BIA activities, including capacity / capability allocation & resource management.
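The Current / Future Status Analysis and Capacity Gaps items can be illustrated with a small comparison of current capacity against forecast demand per resource type. The resource names and figures below are hypothetical, and real capacity planning would of course cover far more dimensions:

```python
# Capacity-gap sketch: flag every resource whose forecast demand exceeds
# what is available today, mirroring the Capacity Gaps item above.

def capacity_gaps(current: dict[str, float], forecast: dict[str, float]) -> dict[str, float]:
    """Return the shortfall (forecast - current) for each resource whose
    expected demand exceeds current capacity; adequately covered resources
    are omitted."""
    gaps = {}
    for resource, demand in forecast.items():
        shortfall = demand - current.get(resource, 0.0)
        if shortfall > 0:
            gaps[resource] = shortfall
    return gaps

# Invented current capacity vs. forecast demand:
current = {"data_engineers": 4, "bi_developers": 3, "compute_nodes": 10}
forecast = {"data_engineers": 6, "bi_developers": 3, "compute_nodes": 16}
# capacity_gaps(current, forecast) flags data_engineers and compute_nodes only.
```

The resulting gap list is exactly the input the Resource Allocation item needs: each flagged resource becomes a line in the strategy for filling the gap.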

Level 4: Managed
- Acceptance Evidence: The monitoring report on the effectiveness of the practices and processes for managing and governing business intelligence and analytics, through predefined key performance indicators (KPIs).
- Acceptance Criteria: A report on the effectiveness of the practices and processes for managing and governing business intelligence and analytics must be attached, using predefined KPIs. Each indicator’s data or card must include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).


- Acceptance Evidence: The monitoring report on the effectiveness of the training and awareness sessions conducted, through predefined key performance indicators (KPIs).
- Acceptance Criteria: A report on the effectiveness of the training and awareness program must be attached, using predefined KPIs, including, as a minimum:
  • The number of training and awareness sessions conducted.
  • Other optional indicators include:
    - The degree of satisfaction of the participants or target audience upon completion of the training course or awareness campaign.
    - The percentage increase in participation in subsequent events or activities, and the percentage of people who successfully completed the training course compared to the total number of participants.
- Each indicator’s data or card must include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

- Acceptance Evidence: Performance Measurement Report for the Business Intelligence and Analytics Team.
- Acceptance Criteria: The report should include the following:
  • Team Performance Indicators: tracking the team’s performance through specific indicators such as execution level, quality of work produced, and adherence to timelines.
  • Goal Achievement: analysis of the team’s achievement of pre-defined goals, such as the development and implementation of specific analytics.
  • Data and Analytics Quality: evaluation of the quality of the data used in the analytics and the accuracy of the resulting insights.
  • Work Efficiency: an estimate of the team’s overall work efficiency, based on completed tasks and the effective use of tools and techniques.
  • Feedback and Satisfaction: the report may also include surveys of feedback from clients or internal users to assess their satisfaction and measure the quality of the services provided.
  • These are examples; the content of a Business Intelligence and Analytics Team Performance Measurement Report may vary depending on the needs and goals of the organization.

Level 5: Pioneer
- Acceptance Evidence: Establishment of the Business Intelligence and Analytics Center document.
- Acceptance Criteria: A document must be attached confirming the establishment of the Business Intelligence and Analytics Center within the Entity. The document should include the center’s name, objectives, and responsibilities.

- Acceptance Evidence: The continuous review and improvement of business intelligence and analytics management and governance practices.
- Acceptance Criteria: A report must be attached showing the Reviews, Continuous Improvement & Governance Plan for BIA practices, including the following, as a minimum:
  • The Review Process: the stakeholders’ roles & responsibilities, e.g., the data science / BIA team, business & IT stakeholders. Review methods & tools may be referenced, e.g., data collection & data analysis methods & tools.
  • Schedule: the timetable for conducting the review. The schedule may be based on a specific timeframe, e.g., quarterly, semiannually, or annually, or on specific incidents / events, such as the completion of a major project.
  • Metrics: the measures used to assess the effectiveness of BIA practices & BIA governance, which may include:

    - Performance metrics, such as the accuracy of the reports & dashboards.
    - Compliance metrics, such as conformity with Data Security & Privacy Policies.

- Acceptance Evidence: Continuous Review and Improvement of Business Intelligence and Analytics Team Performance.
- Acceptance Criteria: The document should include, at a minimum:
  • Team Performance Review: regularly collect and analyze data related to team performance, utilizing any previous measurements of team performance.
  • Identify Strengths: identify areas of strength in performance to enhance and/or leverage them in other tasks or projects.
  • Identify Areas for Improvement: identify weaknesses in performance and determine areas that need improvement.
  • Enhance Training and Development: training and development opportunities provided to team members, based on regular performance reviews, to enhance their skills and capabilities in the field of business intelligence and analytics. These opportunities may include workshops, training courses, and individual mentoring.

- Acceptance Evidence: The Updated Capacity Planning document.
- Acceptance Criteria: An updated Capacity Planning document must be attached, including:
  • Current Status Analysis: analyzing the current capacity of BIA activities.
  • Future Status Analysis: analyzing the expected demand for BIA services.
  • Requirements: detailing the specific resources required to satisfy the expected BIA service demand, including hardware devices, software, and employees.
  • Capacity Gaps: identifying the gaps between the current & future capacities / capabilities, and defining strategies to fill the gaps.
  • Resource Allocation: documenting the process of allocating resources to support the Entity’s BIA activities, including capacity / capability allocation & resource management.

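The capacity-gap analysis described above can be sketched in a few lines. This is an illustrative example only; the resource names and figures are hypothetical and would come from the Entity's own current-status and future-status analyses.

```python
# Hypothetical capacity-planning sketch: compare current BIA capacity
# with forecast demand and report the gap per resource type.
# All resource names and numbers are illustrative, not prescribed by the NDI.

current_capacity = {"analysts": 4, "dashboard_servers": 2, "etl_slots": 10}
forecast_demand = {"analysts": 6, "dashboard_servers": 2, "etl_slots": 14}

def capacity_gaps(current, forecast):
    """Return only the resources whose forecast demand exceeds current capacity."""
    return {
        resource: forecast[resource] - current.get(resource, 0)
        for resource in forecast
        if forecast[resource] > current.get(resource, 0)
    }

print(capacity_gaps(current_capacity, forecast_demand))
# {'analysts': 2, 'etl_slots': 4}
```

Each reported gap would then feed the "strategies to fill the gaps" and resource-allocation steps of the plan.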
- The Revised Demand Management Process.

- The revised process document of BIA Service Demand Management must be attached, showing the following:

 The review results summary of the Demand Management Process.

 Demand Analysis: Updates on the process of analyzing the BIA service demand, including request volume, frequency, and complexity.

 Request Prioritization: Updates on the process of prioritizing BIA service requests, including updates on the criteria that will be used to prioritize each request.

 Resource Allocation: Updates on the process of allocating resources to support the BIA service demand.


 Communication: Updates on the process of communicating the BIA service demand status to the stakeholders, and about the performance of the Demand Management Process.

 Governance & Compliance: Updates on the Governance & Compliance process to implement this approved process for BIA Service Demand Management.

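The request-prioritization criteria above can be made concrete with a small scoring sketch. The weights and criteria here are purely illustrative; an Entity would substitute its own approved criteria from the Demand Management process.

```python
# Hypothetical prioritization sketch for BIA service requests.
# Weights and criteria are illustrative examples, not NDI requirements.

WEIGHTS = {"business_impact": 0.5, "urgency": 0.3, "complexity": -0.2}

def priority_score(request):
    """Weighted score: higher impact/urgency raise priority; complexity lowers it."""
    return sum(WEIGHTS[k] * request[k] for k in WEIGHTS)

requests = [
    {"id": "REQ-1", "business_impact": 8, "urgency": 9, "complexity": 3},
    {"id": "REQ-2", "business_impact": 5, "urgency": 4, "complexity": 7},
]

# Serve the highest-scoring requests first.
for r in sorted(requests, key=priority_score, reverse=True):
    print(r["id"], round(priority_score(r), 1))
```

Documenting the weights alongside the process satisfies the "criteria used to prioritize each request" evidence item.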
Checklist – Business Intelligence and Analytics Domain

BIA.MQ.4 Has the Entity implemented the right tools, technologies, and skills to empower users and support the implementation of the BIA use cases?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The list of BIA tools. - A document must be attached covering the list of tools used for the Entity's BI activities. For each tool being used, the following must be indicated:

 The name of the tool.

 The target audience / users of the tool.

- The list of reports. - A document must be attached covering:

 The list of names of the BIA reports that were generated.

 Complete copies of the BIA reports.

Level 2: Defined

- The list of business units which use the BIA technology tools. - A document must be attached covering:

 A list of names of the business units that use BI technology tools.

 Details of the business units that use BI technology tools, which include, as a minimum:

 The name of the business unit.

 The name of the tool utilized.

 The version of the tool.

 The purpose of the usage.


 The Stakeholders' Cooperation to provide the Semantic Layer's details.

Level 3: Activated

- The list of users with roles and privileges. - A document must be attached covering:

 The list of users with their roles & privileges.

- The Approved Architecture & Documentation for Advanced Analytics. - A document must be attached describing the Approved Architecture & the Approved Documentation for the Entity's Advanced Analytics, which may include:

 The Advanced Analytics Architecture, including technologies required to support Advanced Analytics.

 Data Storage: Data Storage solutions required to store and manage large data quantities to be used in Advanced Analytics.

 Data Processing: Processing Algorithms & Technologies used to analyze & model data within an Advanced Analytics system.

 Analytics Applications: The applications & tools used to create & deploy analytical models & insights.

 Governance: The Processes & Policies required to control the use of Advanced Analytics within the Entity, including Data Security, Data Privacy, and Compliance Requirements.

 The Implementation Plan: The steps required to implement the Advanced Analytics system within the Entity, including timetables & requirements such as budgets & resources.

- Advanced Analytics and Communication Project Management Documents. - The following documents should be included:

 Documents for advanced analytics project management, such as project planning and scheduling, project organization and governance.

 Communication documents, including procedure documents and templates for updating project status reports, progress, and results, and sending them to relevant parties (stakeholders), including executive sponsors, business users, and IT staff.

 Document management and sharing mechanisms.


- The Approved Advanced - A document must be attached listing all models developed for Advanced
Analytics Models Analytics, model verifications and model approvals for usage. For each
Documents. Advanced Analytics Model, the following must be documented, as a minimum:

 The purpose of the Model.

 The data sources used.

 The methodology used.

 Verification, Validation, and Testing processes.

Level 4: Managed

- Report on the Effectiveness Monitoring of Business Intelligence and Analytics Tools and Techniques through Predefined Key Performance Indicators. - The report on the effectiveness monitoring of business intelligence and analytics tools and techniques should include the following:

 Key Performance Indicators: Predefined key performance indicators and actual measurements for the adoption and utilization of business intelligence and analytics tools. These indicators may include the number of participating users, data processing volume, response time, and performance.

 Tool and Technique Usage: Evaluation of the extent of usage of the adopted tools and techniques for data analysis and value extraction. This may include details on usage ratios and distribution among different departments and teams within the organization.

 Results and Analysis: The results, analysis, and insights achieved through the use of business intelligence and analytics tools.

 Recommendations and Improvements: Recommendations for improving the usage of tools and techniques and enhancing their adoption within the organization. Areas for improvement are identified, and appropriate guidance is provided to enhance effectiveness and optimize the benefits of business intelligence and analytics tools.

 Success Evaluation: Evaluation of the success of adopting and utilizing business intelligence and analytics tools. Results are measured against expected objectives and outputs, and an analysis is provided on the achieved benefits and ongoing challenges.

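Two of the adoption indicators named above (participating users and usage distribution across departments) can be computed directly from tool usage logs. The log records below are hypothetical; real entries would come from the Entity's BI platform.

```python
# Illustrative computation of two adoption KPIs: the number of
# participating users and the usage share per department.
# The usage_log records are invented for this sketch.

from collections import Counter

usage_log = [
    {"user": "u1", "dept": "Finance"},
    {"user": "u2", "dept": "Finance"},
    {"user": "u3", "dept": "HR"},
    {"user": "u1", "dept": "Finance"},   # repeat visits count once for users
]

# Distinct users who interacted with the BIA tools.
participating_users = len({rec["user"] for rec in usage_log})

# Share of total tool usage per department.
dept_share = {
    dept: count / len(usage_log)
    for dept, count in Counter(rec["dept"] for rec in usage_log).items()
}

print(participating_users)   # 3
print(dept_share)            # {'Finance': 0.75, 'HR': 0.25}
```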

Level 5: Pioneer

- Document for Continuous Adoption of Technologies, Tools, Frameworks, and Features. - The document for continuous adoption should include the following:

 A list of new tools and technologies that have been evaluated or tested and identified.

 Advanced technologies (such as Spark, TensorFlow, and PyTorch) that have been integrated into the artificial intelligence / machine learning environment.

 A list of advanced capabilities, such as tools and engineering features, in artificial intelligence / machine learning that are utilized for complex processing and reusability.

 A list of tools and techniques used in artificial intelligence / machine learning operations for comprehensive management of the artificial intelligence / machine learning lifecycle.

- The Continuous Improvement Report of the BIA and Advanced Analytics Technology Solutions, including the Proof of Concept (POC) Roadmap. - A document must be attached for the new technological solution's roadmap of advanced analytics (POC), which is a detailed plan for testing and evaluating new technologies or tools in the advanced analytics field. This is to prove the existence & implementation of a continuous improvement mechanism. The Report must include the following, as a minimum:

 The Scope: Specifying the scope of the new technological solution's POC, including the technologies being tested, data sources, and the target audience.

 The Success Criteria: Identifying the success criteria and metrics that will be used to assess success.

 The Timeline: Setting realistic timetables, including start & end dates and milestones.

 Resources: Determining the resources needed to implement the new technological solution's POC, including technologies, data, employees, and budget.

 Risks & Challenges: Identifying potential risks & challenges associated with the new technological solution's POC and developing a mitigation plan.

 The Communication Plan: Developing a communication plan to keep stakeholders informed about the progress and the results of the new technological solution.

 Next Steps: Determining the next steps following a POC, such as upscaling the new technological solution, performing extra tests, or additional implementations.

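The success-criteria step above lends itself to a simple pass/fail check at the end of a POC. The criteria and measured values below are hypothetical; a real roadmap would define its own metrics and thresholds.

```python
# Hypothetical POC evaluation sketch: compare measured results against
# the success criteria defined in the POC roadmap. Metric names and
# thresholds are invented for illustration.

success_criteria = {"prediction_accuracy": 0.85, "max_latency_ms": 500}
measured = {"prediction_accuracy": 0.91, "max_latency_ms": 420}

def poc_passed(criteria, results):
    """A POC passes when every criterion is met:
    accuracy at or above target, latency at or below the cap."""
    return (results["prediction_accuracy"] >= criteria["prediction_accuracy"]
            and results["max_latency_ms"] <= criteria["max_latency_ms"])

print(poc_passed(success_criteria, measured))  # True
```

The boolean outcome, together with the raw measurements, would feed the "Next Steps" decision (upscale, retest, or stop).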

- A Report on the Proof of Concept (POC) Results. - A report must be attached containing the POC results of each new technological solution, including, as a minimum:

 An Executive Summary: A brief overview of the main objectives & key results.

 The Methodology: A description of the methodology used in conducting the POC & performing the periodic tests of the technological solution.

 The Results: A detailed analysis of the POC results & the periodic testing results, including the solution's strengths & weaknesses, prediction accuracy, and any limitations or issues encountered.

 Recommendations: Any insights or recommendations derived from the results.

 The Conclusion: A conclusion that summarizes the POC results & highlights the business implications.

 The Future Directions: A discussion of the project's potential next steps, including any recommendations for further development, testing, or deployment.

 Appendices: Supporting materials such as technical documentation, programming code samples, data visualizations, and interfaces.

8.2.10. Data Value Realization Domain

Checklist – Data Value Realization Domain

DVR.MQ.1 Has the Entity developed a plan to identify, document and realize its Data revenue generation potential and implement Data-related cost optimization initiatives and use cases?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing List of Data Value Realization (DVR) Activities. - A document must be attached covering a Proof-of-Concept (POC), or other temporary tasks performed as practices that support Data Value & Benefit Realization within the Entity. The document must include the following:

 The currently existing DVR Domain practices.

 A separate description for each activity including, as a minimum:

 The name of the activity.

 The purpose of the activity.

 The achieved or expected output of the activity.

Level 2: Defined

- The Data Value Realization (DVR) Plan. - The DVR Plan must be attached to activate Data revenue generation capabilities (profitable revenue from data) and activate Data-based cost-reduction initiatives. The plan shall include the following, as a minimum:

 A roadmap that includes the activities and milestones for the DVR use cases. The activities shall incorporate what is needed to achieve this Domain's specifications, as a minimum.

 The assignment of the required resources & budget allocation to manage the implementation of the DVR use cases.

- The List of Identified Use Cases for Both Revenue Generation & Cost Optimization. - A document must be attached covering the DVR use cases. Each use case should be documented with the following details, as a minimum:

 Name of use case.

 Type of use case, e.g.:

 Data Revenue Generation Use Cases – Data or Data Products which generate revenue for the Entity.

 Cost Saving Use Cases – Data-related cases which will directly or indirectly contribute to reducing expenses and achieving greater efficiency.


 The stakeholders required to implement the use cases, the official who will lead the use case, and the target beneficiary who will benefit from the implementation of the use case.

 A list of business requirements needed to implement the use cases.

 Data sources and required data fields.

 The technology required to implement the use cases.

- A Document Explaining the Payback Period and Return on Investment (ROI) for Each Identified Use Case. - A document must be attached containing the following:

 An estimation of the payback period and the ROI for each DVR use case.

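The two estimates required for each use case can be sketched with the standard formulas. The cash-flow figures below are invented for illustration; the Entity's own benefit and cost estimates would replace them.

```python
# Hedged sketch of the payback-period and ROI estimates required for
# each DVR use case, using the conventional formulas. All monetary
# figures are hypothetical.

def roi(total_benefit, total_cost):
    """ROI as a fraction: (benefit - cost) / cost."""
    return (total_benefit - total_cost) / total_cost

def payback_period_years(initial_cost, annual_net_benefit):
    """Years needed for cumulative net benefit to recover the initial cost
    (assumes a constant annual net benefit)."""
    return initial_cost / annual_net_benefit

print(roi(total_benefit=300_000, total_cost=200_000))              # 0.5
print(payback_period_years(200_000, annual_net_benefit=100_000))   # 2.0
```

An ROI of 0.5 (50%) and a two-year payback would then be compared against the projected values in the DVR plan during the Level 3 monitoring activities.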
Level 3: Activated

- A Report on DVR Monitoring & Maintenance. - A document must be attached showing the implementation status of the activities being done for data value realization, including the following, as a minimum:

 The DVR plan status report.

 The organizational structure of the DVR activities.

 The use cases' monitoring and maintenance status report.

- The monitoring and maintenance of the DVR use cases must include the following, as a minimum:

 Measuring and verifying / validating the KPIs (ROI & Payback Period) against the projected income values in the DVR & Income Plan.

 Developing Change Request (CR) documents to accommodate change requirements from the end users.

 Reporting defects & malfunctions in the implemented use cases.


Level 4: Managed

- The Monitoring Report of the DVR Use Cases with Pre-defined KPIs. - The Monitoring Report of the implemented DVR Use Cases must be attached. The report must include the following indicators, as a minimum:

 The number of Data Products developed.

 The number (i.e., quantity) of Data or Data Product revenue generation requests raised to NDMO.

 The DVR Use Case Payback Period.

 The DVR Use Case Return on Investment (ROI).

- Each indicator's data or card must include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- The Monitoring Report of the DVR Activities & Plan with Pre-defined KPIs. - The Monitoring Report must be attached containing the DVR activities & plan with the pre-defined KPIs (Indicator Cards).
- Each indicator’s data or card must include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.


 Indicator's Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

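The indicator-card fields listed above map naturally onto a simple record type. The sketch below is one possible representation; the field names mirror the checklist, while the example values are entirely hypothetical.

```python
# Minimal sketch of an "indicator card" as a Python dataclass.
# Field names follow the checklist's minimum card contents; the
# sample values (code, owner, equation, targets) are invented.

from dataclasses import dataclass, field

@dataclass
class IndicatorCard:
    code: str                # Indicator's Name / Code
    owner: str               # Indicator's Owner
    coordinator: str         # Indicator's Coordinator
    description: str
    objective: str           # strategic / operational objective measured
    equation: str
    unit: str                # Percentage, Number / Quantity, etc.
    baseline: float          # value in the first measurement year
    target: float
    periodicity: str         # Monthly / Quarterly / Biannually / Annually
    data_sources: list = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: str = "+"      # "+": higher is the target; "-": lower is the target

card = IndicatorCard(
    code="DVR-KPI-01", owner="CDO", coordinator="DVR Lead",
    description="Payback period of DVR use cases",
    objective="Realize data value",
    equation="initial cost / annual net benefit",
    unit="Number", baseline=3.0, target=1.5, periodicity="Annually",
    data_sources=["Finance system"],
    collection_mechanism="Quarterly extract",
    polarity="-",   # a shorter payback is better
)
print(card.code, card.polarity)
```

Keeping all cards in one structured list makes the periodic monitoring reports straightforward to generate.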
Level 5: Pioneer

- Review & Continuous Improvement Document of the DVR Plan. - The Entity must provide a document for the periodic reviews, results, and continuous improvement mechanisms of the DVR Plan, and a list of any improvements made to the plan.

- The Revised & Updated DVR KPIs. - A document must be attached including the following, as a minimum:

 The revised & updated DVR KPIs.

- Each indicator's data or card must include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.


 Indicator's polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- Evidence of New Partnerships (e.g., MoU, jointly developed use cases or products, etc.). - A document must be attached proving new partnerships, which may contain proofs such as:

 Memorandum of Understanding (MoU): In the event of signing a Memorandum of Understanding, a copy of the MoU must be attached.

 Jointly Developed Use Cases or Data Products: If the partnership contributed to the development of new use cases or products, they must be documented and described. There must be a brief explanation of each use case and each data product, its purpose, and the value it may provide.

Checklist – Data Value Realization Domain

DVR.MQ.2 Has the entity implemented practices to support a data revenue generation process?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Practices Related to Supporting the Data Revenue Generation Process. - A document must be attached containing the current practices which support Data revenue generation including, as a minimum:

 A list of the current activities (reports and/or dashboards, etc.) related to the DVR Domain practices.

Level 2: Defined

- The Defined Pricing Scheme. - A document must be attached containing:

 An appropriate Pricing Model for any Data and for any Data Product that is expected to generate revenue.


- The Data or Data Product Price Calculation. - A document must be attached containing the total expense calculation for each Data or Data Product which is expected to generate revenue. The total cost shall include the following, as a minimum:

 The Data Collection Cost – The total of the costs incurred for collecting, cleansing, and curating data.

 The Data Development Cost – The total of the costs incurred for developing analytical models, Data visualizations, and other value-added services provided on top of the collected data.

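The total-expense calculation above is a simple sum of the two cost components; one possible next step, spreading that total over expected consumers to obtain a cost-recovery unit price, is sketched below. All figures, and the cost-recovery interpretation itself, are illustrative assumptions rather than NDI prescriptions.

```python
# Minimal sketch of the total-expense calculation: the sum of the
# Data Collection Cost and the Data Development Cost. The figures
# and the cost-recovery unit-price step are hypothetical.

collection_cost = 120_000    # collecting, cleansing, and curating data
development_cost = 80_000    # models, visualizations, value-added services
total_cost = collection_cost + development_cost

# One way to derive a cost-recovery unit price: divide the total cost
# by the expected number of paying consumers (an assumed figure here).
expected_consumers = 400
cost_recovery_unit_price = total_cost / expected_consumers

print(total_cost)                # 200000
print(cost_recovery_unit_price)  # 500.0
```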
- The Adopted / Approved Charging Model for Each Data or Data Product. - A document must be attached containing the approved Charging Calculation Model for each Data or Data Product that is expected to generate revenue.

- Examples of Charging Calculation Models include, but are not limited to:

 The Subscription Model.

 The Consumption Based Model.

 The Freemium / Premium Model.

 The One Time Fee Model.

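Two of the listed models, Subscription and Consumption-Based, can be sketched as charge calculations. The rates and allowances below are hypothetical.

```python
# Illustrative charge calculations for the Subscription and
# Consumption-Based charging models. Rates are invented.

def subscription_charge(months, monthly_fee):
    """Flat recurring fee regardless of usage."""
    return months * monthly_fee

def consumption_charge(units_consumed, unit_price, included_units=0):
    """Charge only for units beyond any included allowance
    (with included_units > 0 this resembles a freemium tier)."""
    return max(0, units_consumed - included_units) * unit_price

print(subscription_charge(12, 1_000))                                   # 12000
print(round(consumption_charge(5_000, 0.02, included_units=1_000), 2))  # 80.0
```

The Freemium / Premium and One Time Fee models would be simple variations on these two.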
Level 3: Activated

- Evidence of Revenue Generation Requests Submitted to NDMO-SDAIA. - A document must be attached containing proofs of requests sent to NDMO-SDAIA for revenue or income generation from Data or from Data Products.

- Each request shall include the following, as a minimum:

 The Description of the Data or the Data Product.

 The Documentation of the Data Product Pricing Scheme / Model & the Service Pricing Scheme / Model.

 The Proposed Charging Model.

 The Proposed Final Unit Price.

 The Justification if the Final Unit Price does not follow the Cost Recovery Pricing Scheme.

Level 4: Managed

- The Monitoring Report of the Data Revenue Generation Process with Pre-defined KPIs. - A report must be attached showing the monitoring of the Data Revenue Generation Process efficiency using pre-defined KPIs (Indicator Cards).

- The report must include the following indicators, as a minimum:

 The number of Data Products that generated revenue.

 Total revenue generated from offering Data or Data Products.

 The total cost saved from the implemented Cost Saving Use Cases.


- Each indicator's data or card must include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- Continuous Improvement Report of the Data Revenue Generation Process. - A document must be attached containing, as a minimum, the following:

 Continuous improvement mechanisms.

 Documents of the periodic reviews and results of the Pricing Model, the Price Calculation for Data or Data Products, and the Charging Fee Calculation Model.

 The improvement recommendations or updates on the data revenue generation process based on the review results.

8.2.11. Open Data Domain

Checklist – Open Data Domain

OD.MQ.1 Has the Entity defined, established, and implemented a plan to identify and coordinate the publishing of its Open Datasets?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The existing list of Open Datasets. - A document must be attached covering the current list of open datasets. For each existing open dataset, the following information must be provided, as a minimum:

 Name: The name of the open dataset.

 Description: A brief description of each open dataset, including its source and any related information.

 Download Links: The download link of each open dataset.

Level 2: Defined

- The Approved Open Data Framework. - A document must be attached including the following, as a minimum:

 The approach / methodology used to make open data available to the public.

- The Approved Open Data Plan. - The approved OD plan must be attached including the following, as a minimum:

 A roadmap that includes activities and milestones for the implementation of OD initiatives. The activities shall include what is necessary to achieve the specifications of this Domain, as a minimum.

 Allocation of the required resources & budget to manage the implementation of OD initiatives.

- The OD Management Structure. - An Open Data Management Structure document must be attached including, as a minimum:

 The roles & responsibilities.

 The compliance audit framework.


- The Developed and Approved Plan for Change Management (including awareness campaigns). - A document must be attached containing the change management plan that has been developed & approved including, as a minimum:

 The OD activities training plan.

 The OD activities awareness campaigns plan, including the following, as a minimum:

 The usage of open data, and its various positive social & economic benefits.

 Promoting the Entity's open data & the related activities.

Level 3: Activated

- The Open Data Plan Implementation Status Report. - A report must be attached showing the open data plan implementation status including, as a minimum:

 The achievement percentages of the initiatives & projects included in the OD implementation plan.

- Evidence of Submission of the Annual Compliance Report to NDMO-SDAIA. - The Entity must attach the Annual Compliance Report submitted to NDMO-SDAIA and attach the evidence of the submission.

- Assignment Decisions / Appointees to Job Roles. - A document must be attached including the hiring / assignment decisions for the following roles, including job descriptions:

 Open Data & Information Access Officer (ODIAO).

 Business Data Executive (BDE).

 Business Data Steward.

- Evidence of Implementation of the Change Management Program (the conducted training courses and the launched awareness campaigns related to Open Data). - A document should be attached that includes the following:

 Evidence of the implementation of training courses related to Open Data.

 Evidence of the launch of awareness campaigns related to Open Data.


Level 4: Managed

- The Monitoring Report on the Effectiveness of the Open Data Plan through Predefined Key Performance Indicators (KPIs). - A monitoring report must be attached showing the Open Data Plan implementation & must be prepared based on pre-defined KPIs (Indicator Cards). Each indicator's data or card must include the following, as a minimum:

 Indicator's Name / Code.

 Indicator's Owner.

 Indicator's Coordinator.

 Indicator's Description.

 The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

 Indicator's Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator's polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The periodic reviews and improvements of the open data plan. - A document must be attached containing the periodic reviews and improvements of the open data plan, including, at a minimum, the following:

 Mechanisms for continuous review and improvement of the plan and roadmap.

 Documented reviews and periodic results.

 Any improvements made to the plan and roadmap.

Checklist – Open Data Domain

OD.MQ.2 Has the Entity defined, established, and implemented a process to support the identification of Open Data (OD)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Practices for OD Identification. - A document must be attached, explaining:

 The Current OD Identification Practices.

Level 2: Defined

- The Defined Process Documentation for Managing the Lifecycle of Open Data. - A document must be attached containing the Open Data Lifecycle Management Processes, including the following, as a minimum:

 Processes to identify public datasets to be published by the Entity.

 Processes to ensure that datasets are published and maintained in their appropriate format, timeliness, comprehensiveness, and overall high quality, and to ensure the exclusion of any restricted data.

 Processes for gathering feedback, analyzing performance at the Entity level, and improving the overall Open Data national impact.

- The Defined Process Documentation for Identifying Open Data. - A document must be attached containing the Open Data Identification Process, including the following, as a minimum:

 Identifying and documenting all data classified as 'Public' and prioritizing each dataset included under open data.

 Evaluating the identified datasets to determine whether they should be published as open data or not.

 Determining whether the combination of any publicly available data and the data to be published would constitute an unauthorized disclosure of Personal Information, or any other security or privacy risk or threat.

- The Process of Evaluating the Value and Impact of Open or Public Datasets. - A document must be attached that outlines the process for assessing the estimated value and potential impact of open or public datasets, which should include, at a minimum, the following:

 Steps and methods for analyzing the value of each open dataset from different dimensions, such as economic (e.g., ROI), social, and environmental dimensions.

 Steps and methods for evaluating the potential impact after sharing open or public datasets, such as increasing transparency, improving decision-making processes, and enhancing research and innovation.

 Risk assessment and mitigation methods for identifying potential risks associated with open datasets, such as security threats, data breaches, privacy concerns, Data Quality (DQ) issues, intellectual property violations, etc.

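The value/impact/risk evaluation above ultimately yields a publish-or-not decision per dataset. The sketch below shows one simple way to combine dimension scores with a blocking-risk check; the scores, threshold, and decision rule are all illustrative assumptions.

```python
# Illustrative publish decision for a candidate open dataset:
# publish when the average value score across dimensions meets a
# threshold and no blocking risk was identified. The 0-10 scores,
# the threshold, and the rule itself are invented for this sketch.

def publish_decision(value_scores, risks, threshold=6):
    """value_scores: dict of dimension -> score (0..10).
    risks: list of blocking risks found in the risk assessment."""
    average_value = sum(value_scores.values()) / len(value_scores)
    return average_value >= threshold and not risks

print(publish_decision(
    {"economic": 8, "social": 7, "environmental": 6}, risks=[]))
# True
print(publish_decision(
    {"economic": 9, "social": 9, "environmental": 9},
    risks=["contains personal data"]))
# False
```

A real process would document the scoring rubric and escalate any identified risk for mitigation rather than simply blocking publication.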
Level 3: Activated

- The OD Identification Process Implementation Status Report: A report must be attached showing the implementation status of the Open Data identification process.

- The List of Identified Open Datasets with the Assigned Priorities: A document must be attached containing the list of public datasets considered to be published as open data, with the classification & prioritization information for each dataset.

- For each dataset, the following information must be provided, as a minimum:

 Name: The name of the dataset.

 Description: A brief description of each dataset, including its source and any related information.

 Size: The size of each dataset in terms of the number of records or file size.

- The Identified & Documented Metadata for the Open Datasets: A document must be attached containing the identified & documented metadata of the open datasets, including, as a minimum:

 The necessary metadata for each open dataset, to easily identify, describe, and search for it once published.
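For illustration only, since the checklist does not prescribe a schema, the minimum metadata fields above (name, description, size) could be captured in a machine-readable record such as the following sketch; every field name and value here is hypothetical, not an NDMO-SDAIA requirement:

```python
# Hypothetical metadata record for one open dataset, covering the minimum
# fields named in the checklist; the extra keys are illustrative assumptions.
dataset_metadata = {
    "name": "Example Dataset",           # name of the dataset (hypothetical)
    "description": "Illustrative description, including the source.",
    "source": "Example source system",   # assumed field, not mandated
    "size_records": 1000,                # size as a number of records
    "size_bytes": 52428800,              # or as a file size
}

def validate_metadata(record: dict) -> list:
    """Return the minimum required fields that are missing or empty."""
    required = ("name", "description")
    missing = [f for f in required if not record.get(f)]
    # at least one size measure (records or bytes) must be present
    if record.get("size_records") is None and record.get("size_bytes") is None:
        missing.append("size")
    return missing
```

A check like `validate_metadata(dataset_metadata)` returning an empty list is one simple way an Entity could verify that each record carries the minimum fields before publication.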


- Value and Impact Assessment Report for the Identified Open or Public Datasets: A value and impact assessment report for the identified open or public datasets to be published must be attached, which should include, at a minimum, the following:
 Results of evaluating the value and impact of the
identified datasets to decide whether or not to publish
them as Open Data.

 Risk assessment report for identified potential risks associated with publishing the open datasets.

Level 4: Managed

- The Monitoring Report of OD Identification & Prioritization Processes with Pre-defined KPIs: A report must be attached showing the monitoring of the OD Identification Process; it must be prepared based on pre-defined KPIs (Indicator Cards), including the key performance indicator "Number of identified and prioritized open datasets".

- Each indicator’s data or card must include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: higher indicator value is the target; Negative Polarity: lower indicator value is the target).
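The indicator card described above is, in effect, a small structured record. The following sketch captures the minimum fields as a data class; the field names, the example values, and the on_track helper are illustrative assumptions, not an NDMO-SDAIA template:

```python
from dataclasses import dataclass

@dataclass
class IndicatorCard:
    """One KPI card with the minimum fields named in the checklist."""
    name_code: str
    owner: str
    coordinator: str
    description: str
    objective: str            # strategic / operational objective measured
    equation: str
    unit: str                 # Percentage, Number / Quantity, etc.
    baseline: float           # measurement value in the first year
    target: float
    periodicity: str          # Monthly / Quarterly / Biannually / Annually
    data_sources: str
    collection_mechanism: str
    polarity: str             # "+" (higher is better) or "-" (lower is better)

    def on_track(self, current: float) -> bool:
        """True if the current value meets the target under the card's polarity."""
        return current >= self.target if self.polarity == "+" else current <= self.target

# Hypothetical card for the indicator named in the checklist.
card = IndicatorCard(
    name_code="OD-01",
    owner="Open Data Office",
    coordinator="Data Governance Team",
    description="Number of identified and prioritized open datasets",
    objective="Expand the open data catalog",
    equation="count(identified and prioritized datasets)",
    unit="Number / Quantity",
    baseline=10,
    target=25,
    periodicity="Quarterly",
    data_sources="Open Data Register",
    collection_mechanism="Quarterly export from the register",
    polarity="+",
)
```

The polarity field matters in practice: with polarity "+", `card.on_track(30)` holds because a higher value is the target, while for a negative-polarity indicator (e.g., response time) the comparison reverses.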


Level 5: Pioneer

- The Continuous Improvement Report Showing the Documented Periodic Reviews & Outcomes of the OD Identification Processes and the Implemented Automation: A continuous improvement document must be attached proving the periodic reviews & outcomes of the OD identification processes and the implemented automation, including the following, as a minimum:

 The documented periodic reviews & results for the OD identification processes and the implemented automations.

 The continuous improvement mechanisms of the OD identification processes and the implemented automations.

 The processes which were implemented & automated to support OD identification.

Checklist – Open Data Domain

OD.MQ.3 Has the Entity defined, established, and implemented a process to support publishing its Open Datasets?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing Practices for Publishing Open Datasets: A document must be attached, explaining:

 The current practices related to publishing Open Datasets.

Level 2: Defined

- The Defined Process Documentation for Publishing Open Data: A document outlining the Open Data Publishing Process must be attached, showing steps to ensure that datasets are published and maintained in their appropriate format, timeliness, comprehensiveness, and overall high quality, and to ensure the exclusion of any restricted data.

- The Defined Process Documentation for Open Data Maintenance: A document outlining the Open Data Maintenance Process must be attached, showing activities including, at a minimum, the following:

 Regular update and documentation of changes to its published Open Datasets and associated metadata whenever changes occur.

 Continuous review of the published Open Datasets to ensure they meet the defined regulatory requirements.

 Maintenance of data traceability by documenting data
provenance and maintaining versioning history of the
datasets.


Level 3: Activated

- Status Report on the Implementation of the Open Data Publishing Process: The report must be attached showing the implementation status of the open data publishing process.

- Evidence of Published Datasets on the Saudi Open Data Portal: The proof document should include the datasets published on the National Open Data Portal under the Open Data License in the Kingdom of Saudi Arabia, as outlined in the Open Data Policy issued by NDMO-SDAIA. At a minimum, it should include the following:

 For each dataset that’s currently published on the National Open Data Portal, the following must be documented, as a minimum:

 Name: The name of the open dataset.

 License data.

 Description: A brief description including its source and any related information.

 Size: The size in terms of the number of records or file size.

 Format: e.g., CSV, JSON, or XML.

 Download Links: The link(s) to download the dataset.

 Publishing Date: The date the open dataset was first published.

 Update Date: The last update date.

 Usage and downloads statistics.

- Evidence of Feedback / Comments Received on OD: A document must be attached containing proof that feedback and comments were received on the open datasets.

- Evidence of Formats Used to Standardize Open Datasets in Machine-Readable Form: A document must be attached showing the formats of the published open datasets, such as CSV, JSON, or XML, and instructions on how to use each open dataset according to its published format.
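As an illustration of what publishing in machine-readable form involves (the rows and column names below are invented for the example), the same small dataset can be serialized to both CSV and JSON with Python's standard library:

```python
import csv
import io
import json

# Hypothetical rows; a real open dataset would come from the Entity's systems.
rows = [
    {"region": "A", "value": 10},
    {"region": "B", "value": 20},
]

# JSON: one array of objects.
as_json = json.dumps(rows)

# CSV: a header row followed by one line per record.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["region", "value"])
writer.writeheader()
writer.writerows(rows)
as_csv = buf.getvalue()
```

Both serializations carry the same records; which format an Entity publishes (CSV, JSON, XML, or several) is a choice for its open data process, provided the result can be parsed programmatically rather than only read by a person.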

- Evidence of Data Standards Applied on Open Datasets to Ensure High Data Quality (DQ): A document must be attached showing evidence of the data standards applied on the open datasets to ensure high data quality.


- Open Data Maintenance Report: A report must be attached showing the Open Data maintenance process outcomes, which include, at a minimum, the following:

 Periodic data updates, and documentation of any changes made to the published open datasets and the associated metadata (in the event of any changes).

 Ongoing reviews of published datasets to ensure that they meet the specified regulatory requirements.

 Evidence of data traceability by documenting data provenance and versioning history of the datasets.

- The Open Data Register Containing Records of Open Data Activities and Published Open Datasets: A document must be attached that includes the following, as a minimum:

 A record that includes artifacts related to Open Data Domain activities and decisions made during data life cycle management.

 A record that includes a list of all open datasets, the reviews and changes associated with them, and their metadata.

Level 4: Managed

- The Monitoring Report of the OD Publishing Process with Pre-defined KPIs: The Entity must attach a report on monitoring the OD Publishing Processes based on pre-defined KPIs (Indicator Cards), including the following indicators, as a minimum:

 The number of downloads per published Open Dataset.

 The number of identified Open Datasets that have been published.

 The number of updates performed on published Open
Datasets.

- Each indicator’s data or card must include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.


 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: higher indicator value is the target; Negative Polarity: lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the OD Publication & Maintenance Practices: A continuous improvement document must be attached showing:

 The documented periodic reviews & outcomes for the OD publication and maintenance practices.

 The continuous improvement mechanisms of the OD publication and maintenance practices.

8.2.12. Freedom of Information Domain

Checklist – Freedom of Information Domain

FOI.MQ.1 Has the Entity defined and established a plan to address its compliance with the requirements of the Freedom of Information (FOI) Regulations?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing FOI Practices: The Entity must attach a report on the current practices in the FOI Domain, containing implementation evidence of these practices.

Level 2: Defined

- The Defined & Approved FOI Implementation Plan & Roadmap: The Entity must attach the approved FOI Domain implementation plan and roadmap, including the following, as a minimum:

 A roadmap that includes the activities and milestones for the achievement of full compliance with the FOI regulations published by NDMO-SDAIA. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 The assignment of the required resources & budget allocation to achieve full compliance with the FOI regulations published by NDMO-SDAIA.

Level 3: Activated

- The FOI Plan Implementation Status Report: The Entity must attach a report clarifying the implementation status, including the following, as a minimum:

 The achievement percentages of the initiatives and projects included in the executive action plan of FOI.

- The Assigned Open Data & Information Access Officer (ODIAO): The Entity must attach the data of the appointed Open Data & Information Access Officer (ODIAO), including evidence of the employment decision.


- FOI Awareness: The Entity should attach a report proving its implementation of FOI awareness campaigns, with the aim of instilling and promoting a culture of transparency and of raising awareness of the FOI policy issued by NDMO-SDAIA and the right to access public information. Awareness campaigns should include, at a minimum, the following:

 Raising awareness among employees involved in the processing of FOI requests to understand the main obligations and requirements of the FOI policies issued by the NDMO-SDAIA.

 Raising awareness about the FOI principles and their application to the rights of beneficiaries.

Level 4: Managed

- The Monitoring Report on the Entity's FOI Plan & Activities with Pre-defined Key Performance Indicators (KPIs): The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: higher indicator value is the target; Negative Polarity: lower indicator value is the target).

- The Internal Audit Reports on the Entity's Compliance with the FOI Regulations: The Entity must attach reports including the following:

 The internal audits conducted to monitor compliance with the FOI Regulations published by NDMO-SDAIA.

 The documented audit findings, submitted in a report to the Open Data & Information Access Officer (ODIAO).


 The corrective actions applied in cases of non-compliance, with notifications to the regulatory authority or NDMO as stipulated, and documentation of these improvements in the audit findings report.

Level 5: Pioneer

- The Continuous Improvement Report of the FOI Plan: The Entity must attach a report showing periodic FOI plan reviews to ensure continuous compliance with applicable regulations and other environmental requirements or influences. The report must include the following, as a minimum:

 The FOI plan review documents and the documented periodic results.

 The FOI Plan continuous improvement mechanisms.

Checklist – Freedom of Information Domain

FOI.MQ.2 Has the Entity defined and implemented the required processes for Freedom of Information (FOI)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing Processes for FOI Requests & Responses: The Entity must attach evidence of previously completed process(es) and response(s) to information request(s).

Level 2: Defined

- The Developed & Approved FOI Request Processes & Procedures Documentation: The Entity must attach the developed and approved procedures and processes for FOI requests. The Entity must design and document a standardized / unified process for information requests and develop procedures for managing, processing, and documenting requests for access to public information, in alignment with the FOI Regulations published by NDMO-SDAIA.


- The Developed FOI Process Guide & FAQs: The Entity must attach the developed FOI process guideline and answers to the Frequently Asked Questions (FAQs). The guide must clarify the following, as a minimum:

 The request preparation mechanism.

 The request sending mechanism.

 The response awaiting mechanism.

 The response review mechanism.

 The appeals mechanism (if required).

 Moreover, the FAQs may include:

 Common questions about the FOI process, requirements, timelines, and other information related to requesting information. Publishing FAQs can help clarify common inquiries and provide useful information for individuals seeking to submit FOI requests.

Level 3: Activated

- The Implementation / Adoption Status Report on the FOI Request Processes: The Entity must attach a report on the FOI request process implementation status covering the following:

 Granting access to public information request(s).

 Denying a public information access request(s).

 Extending the time required to respond to specific requests.

 Notifying the requestor(s) if the required information is available on the Entity’s website or is not within its specialty.

- Evidence of Entity-wide Communication: The Entity must attach evidence of Entity-wide communications and public publications, in alignment with the FOI Regulations published by NDMO-SDAIA and without contradicting the applicable regulations in the Kingdom of Saudi Arabia (KSA). The Entity must publish the following information on its official government website or websites linked to it:

 The laws, regulations, instructions and regulatory decisions applicable to or followed within the Entity.

 The Entity's services, with descriptions detailing how to obtain access to those services.

 The Entity's organizational structure, including the roles & responsibilities.

 The Entity's job vacancy information, except information on security or military job vacancies as determined by the security or military regulatory authorities or applicable KSA Regulations.

 The Entity's annual strategic and operational reports, including the Entity's financial statements.


 The Entity's general statistics, news and updates on its activities, including the following, as a minimum:

 The number of the Entity's employees.

 The year of the Entity's establishment.

 The number of the Entity's services provided in the last year.

 The Entity's up-to-date activities' descriptions.

 The projects provided by the Entity, as stated in the FOI Regulations, regarding the risks which may affect people’s lives, health and properties. The information must contain the contact details of persons with valid licenses granted by the Entity, including the following, as a minimum:

 The names of the persons.

 The postal addresses of the persons.

 The e-mail addresses of the persons.

 Information on projects offered or awarded by the Entity as prescribed by the FOI Regulations regarding any risk that may affect people’s lives, health or properties. The information must include the following, as a minimum:

 The names of the recipients.

 The execution periods.

 The technical analysis.

 The guidelines and leaflets that raise people's awareness of their rights to the Entity’s FOI.

 If the information above is not available or applicable, the Entity must provide a justification with this evidence, in line with the FOI Regulations.

- The Register of the Received Request Forms with the Responses: The Entity must attach the prepared request forms for access to Public Information, whether paper or electronic, specifying the required information to be provided by the Requestor. The required information must include the following, as a minimum:

 Information about the Requestor, including name, address, and national ID.

 Description of Public Information requested by the requestor.

 Purpose behind the request for access to public information.



 Notice delivery method to the requestor (e-mail, national address).

 Date of the request.

- The Identified Public Datasets Shared Under the FOI Regulations: The Entity must attach evidence of publishing specific Public Datasets under the FOI Regulations.

- Responding to FOI requests can sometimes lead to the publication of open datasets as part of the Open Data (OD) initiatives.

- When a requestor submits an FOI request to obtain specific information that is within the scope of the FOI Regulations, and the request is granted, the government Entity may provide the requested information in the form of a dataset. If a published dataset meets the criteria of being publicly available, machine-readable, and reusable, it can be considered Open Data (OD).

- Evidence of the Published FOI Communications, Including Guidelines & FAQs, on the Entity's Official Government Website in Line with the NDMO Requirements: The Entity must attach evidence of the FOI communication publications, including guidelines and Frequently Asked Questions (FAQs), on the Entity’s official website, in line with NDMO requirements.

- The Pricing Scheme for Public Information Access Requests: The Entity must attach a pricing mechanism for public information access requests. The Entity must calculate and document the processing fees of each granted / approved public information access request by adopting a Pricing Scheme determined by the Entity and approved by NDMO-SDAIA.

- The Up-to-date FOI Register: The Entity must attach an updated FOI Register, as the Entity must document compliance records in a register as instructed in the FOI guidelines published by NDMO-SDAIA. The Register must include the following, as a minimum:

 Information on the current Open Data and Information Access Officer (ODIAO).

 Public Information Access Request Records.

 Public Entity Publications.

 Any other records, including the manner and format, as required by the FOI Regulations published by NDMO-SDAIA.

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the Entity's Responses to FOI Requests: The Entity must attach a report on monitoring the Entity’s responses to FOI requests based on pre-defined KPIs (Indicator Cards). Each indicator’s data or card should include the following, as a minimum:


 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: higher indicator value is the target; Negative Polarity: lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the FOI Processes: The Entity must attach a report on the updated FOI process(es), including the changes, with reference to revised copies of the documents which identify the steps and procedures for a particular FOI process or task, including:

 The review documents and the documented periodic results of the updated processes, with the changes.

 The FOI Processes continuous improvement mechanisms.

- The Automated Tool for FOI Requests: The Entity must attach a report describing (name, version, etc.) the tool(s) used to automate the FOI request response process.

- A tool refers to a program or an application designed to simplify and automate the handling / processing of FOI requests from submission to completion. Such a tool can enhance efficiency, accuracy, and transparency in managing FOI requests by automating various tasks, reducing manual effort, and ensuring compliance with FOI Laws and Regulations.
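As an illustration of what such a tool automates (the statuses mirror the outcomes listed under Level 3: granting, denying, and extending; the 30-day deadline and all names are assumptions, not NDMO-SDAIA requirements), a request record might be tracked like this:

```python
from datetime import date, timedelta

# Assumed status flow; actual tools and the FOI Regulations may define others.
TRANSITIONS = {
    "submitted": {"under_review"},
    "under_review": {"granted", "denied", "extended"},
    "extended": {"granted", "denied"},
    "granted": set(),
    "denied": set(),
}

class FOIRequest:
    def __init__(self, requestor: str, description: str, received: date,
                 response_days: int = 30):  # deadline is an assumption
        self.requestor = requestor
        self.description = description
        self.received = received
        self.due = received + timedelta(days=response_days)
        self.status = "submitted"

    def advance(self, new_status: str) -> None:
        """Move to a new status, enforcing the assumed transition rules."""
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(f"cannot go from {self.status} to {new_status}")
        self.status = new_status

# A hypothetical request moving from submission to completion.
req = FOIRequest("Jane Doe", "Annual financial statements", date(2023, 10, 1))
req.advance("under_review")
req.advance("granted")
```

Encoding the allowed transitions as data is one way such a tool can guarantee that every request follows the defined process and that deadlines are computed consistently, which is precisely the efficiency and compliance benefit the checklist describes.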

8.2.13. Data Classification Domain

Checklist – Data Classification Domain

DC.MQ.1 Has the Entity established a plan for Data Classification (DC) as stipulated by the Data Management and Personal Data Protection (DM & PDP) Standards?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Existing DC Practices: The Entity must attach a report clarifying the current Data Classification practices.

Level 2: Defined

- The Defined & Approved Data Classification Plan: The Data Classification Plan must include the following, as a minimum:

 The roadmap with the activities & milestones for the classification of the Entity's data. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

 The assignment of the required resources & budget to manage the classification of the Entity's data.

Level 3: Activated

- The Data Classification Implementation Plan Status Report: The Entity must attach a report clarifying the implementation status including, as a minimum:

 The achievement percentages of the initiatives & projects included in the Data Classification Implementation Plan.

Level 4: Managed

- The Implementation Monitoring Report of the DC Plan & Activities with Pre-defined KPIs: The report must be prepared based on the KPI data (Indicator Cards) pre-defined in the Data Classification Plan, and each indicator’s data or card should include the following, as a minimum:
 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.


 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: higher indicator value is the target; Negative Polarity: lower indicator value is the target).

Level 5: Pioneer

- The Data Classification Plan Review Report: The Entity must attach a report showing that the Entity identified, implemented and monitored continuous improvement mechanisms for the Data Classification Plan, including the following, as a minimum:
 The documented periodic reviews & results.

 The continuous improvement mechanisms.

Checklist – Data Classification Domain

DC.MQ.2 Has the Entity defined, identified and implemented the required Data Classification (DC) processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable.

Level 1: Establishing

- The Current List of Classified Datasets: The Entity must attach the current list of classified Datasets.

Level 2: Defined

- Data Classification Policy: Attach a data classification policy, which should include, at a minimum:
 Policy name.

 Release date.

 Release number.

 Document control (preparation, revision, approval).

 Version history.

 Terminology.


 Objective.

 Scope of work.

 References.

 Policy owner.

 Policy statement including as a minimum:

 The main principles of classification

 Classification prioritization process and criteria

 Classification levels

 Classification controls

 Steps required for classification:

1. Define data.

2. Appoint the classification officer.

3. Conduct an impact assessment.

4. Conduct an impact assessment for low-impact data.

5. Classification review.

6. Apply appropriate controls.

7. Roles and responsibilities.

- The Data Handling and Protection Controls: The Entity must attach a document containing the Handling & Protection Controls for each Dataset & Artifact according to its classification, to ensure secure handling, processing, sharing and disposal of data by following the policies & regulations of the National Cybersecurity Authority (NCA). The document shall include, as a minimum:

 The controls for protection, handling & processing of Public Classification cases.

 The controls for protection, handling & processing of
Restricted Classification cases.

 The controls for protection, handling & processing of Confidential / Secret Classification cases.


 The controls for protection, handling & processing of Top Confidential / Top Secret Classification cases.

Level 3: Activated

- The Inventory Report of the Identified Datasets and Artifacts: The Entity must attach a document clarifying an inventory of all Datasets & Artifacts for implementing the Data Classification Process, including, as a minimum:

 Identifying the ownership of the Datasets & Artifacts.

 The list of all Datasets & Artifacts.

- The Prioritized Datasets and Artifacts: The Entity must attach a document showing the list of prioritized Datasets & Artifacts to be followed during classification.

- The Impact Assessment Report: The Entity must attach a report proving that the Entity conducts assessments of the probable impact (e.g., any potential damage) of disclosing specific data or of unauthorized access to specific data. The Impact Assessment process should be implemented for all identified Datasets & Artifacts, including the following steps:

 The identification of the potential categories impacted, amongst national interests, organizations, individuals and the environment.

 The selection of the potential damage impact level for each identified category amongst 'High', 'Medium', 'Low' and 'None / Insignificant'.

 The assignment of Classification Levels to Datasets & Artifacts based on the selected impact level:

 If the impact level was assessed as 'High', then the Datasets & Artifacts shall be classified as Top Confidential / Top Secret.

 If the impact level was assessed as 'Medium', then the Datasets & Artifacts shall be classified as Confidential / Secret.

 If the impact level was assessed as 'Low', then the Datasets & Artifacts shall be classified as Restricted.

 If the impact level was assessed as 'None /
Insignificant', then Datasets & Artifacts shall be
classified as Public.

 The assessment's alignment with the Data Classification Policies & Regulations published by NDMO-SDAIA.
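The impact-to-classification mapping above is a direct lookup, which the following sketch makes explicit; the level labels are taken from the checklist, while the dictionary and function names are our own illustration:

```python
# Mapping from assessed impact level to classification level,
# exactly as enumerated in the checklist above.
IMPACT_TO_CLASSIFICATION = {
    "High": "Top Confidential / Top Secret",
    "Medium": "Confidential / Secret",
    "Low": "Restricted",
    "None / Insignificant": "Public",
}

def classify(impact_level: str) -> str:
    """Return the classification level for an assessed impact level."""
    try:
        return IMPACT_TO_CLASSIFICATION[impact_level]
    except KeyError:
        raise ValueError(f"unknown impact level: {impact_level!r}")
```

For example, `classify("Low")` yields "Restricted", which is exactly the case the next evidence item revisits: Low-impact data starts as Restricted and may be reclassified as Public only after the assessment described below.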


- The Assessment Report of Low-Impact Data: A report shall be attached proving that the Entity researched / assessed the possibility of classifying Low-impact data as Public instead of Restricted. The assessment shall include the following:

 Analyzing / evaluating whether the disclosure of this Low-impact data breaches any existing KSA regulation, such as the Anti-Cyber Crime Regulations and the Electronic Commerce (e-Commerce) Regulations.

 Identifying the potential benefits of disclosing / opening such Datasets & Artifacts and considering whether those benefits would outweigh the negative impacts.

 Reclassifying the Low-impact data as Public if publishing / releasing it would not breach any applicable / existing regulation, especially if the benefits outweigh the negative impacts.

 Aligning the assessment with the Data Classification Policies & Regulations published by NDMO-SDAIA.

- Evidences of Utilization of the Data Catalog Tool for the Data Inventory. - The Entity must attach a report or a proof that the Data Catalog Tool was used for Data Inventorying.

- The Approved Data Access List of Users with the Assigned Privileges. - The Entity must attach a document showing the Data Access list with the assigned permissions / privileges, including the following, as a minimum:

 Specifying the types of users who need Data Access.

 Specifying the Data Access Authorizations (Read, Modify, Delete).
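The two minimum elements above (user types plus Read / Modify / Delete authorizations) can be captured in a minimal record. This is a non-authoritative sketch; the field names and example user roles are assumptions for illustration:

```python
# Hypothetical sketch of a Data Access list entry; field and role names are
# illustrative assumptions, not prescribed by the checklist.
from dataclasses import dataclass, field

AUTHORIZATIONS = {"Read", "Modify", "Delete"}  # the authorizations named above

@dataclass
class DataAccessEntry:
    user_type: str                                     # type of user needing Data Access
    authorizations: set = field(default_factory=set)   # subset of AUTHORIZATIONS

    def __post_init__(self):
        unknown = self.authorizations - AUTHORIZATIONS
        if unknown:
            raise ValueError(f"Unknown authorizations: {sorted(unknown)}")

# Example access list with two assumed user types:
access_list = [
    DataAccessEntry("Data Analyst", {"Read"}),
    DataAccessEntry("Data Steward", {"Read", "Modify"}),
]
```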


- Data Register. - The Entity must attach a register / log documenting the list of all identified
Datasets & Artifacts, in addition to the activities implemented during the Data
Classification process. The register / log should include the following, as a
minimum:

 The list of the Entity's identified Datasets & Artifacts.

 The Classification Levels assigned to the identified Datasets & Artifacts.

 The Assignment Dates of the classification levels to the identified Datasets & Artifacts.

 The mandatory Classification Durations of the identified Datasets & Artifacts.

 The Classification Levels approved / validated during the reviews.

 The Classification Levels' Review Dates.

Level 4: Managed - The Monitoring Report of the DC Processes with Pre-defined KPIs. - A monitoring report on the effectiveness of the Data Classification Processes must be attached, including the following KPIs, as a minimum:

 The percentage of Classified Datasets & Artifacts (out of the Entity’s total Datasets & Artifacts).

 The percentage of Datasets & Artifacts classified with specific classification levels (out of the Entity’s total Classified Datasets & Artifacts).

 The percentage of Low-impact data classified as Restricted.

 The percentage of Classified Datasets & Artifacts that were reviewed, approved & validated.

- Each indicator’s data or card must include the following, as a minimum:

 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.


 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).
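The minimum indicator-card fields listed above map naturally onto a simple record. The following is a hedged sketch (field names, types and the `on_track` helper are assumptions for illustration, not an NDI-mandated structure):

```python
# Sketch of an Indicator Card holding the minimum KPI fields listed above.
# Field names, types and the on_track() helper are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class IndicatorCard:
    code: str                  # Indicator's Name / Code
    owner: str                 # Indicator's Owner
    coordinator: str           # Indicator's Coordinator
    description: str           # Indicator's Description
    objective: str             # strategic / operational objective measured
    equation: str              # Indicator's Equation
    unit: str                  # Percentage, Number / Quantity, etc.
    baseline: float            # measurement value in the first measurement year
    target: float              # Target value
    periodicity: str           # Monthly / Quarterly / Biannually / Annually
    data_sources: List[str] = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: str = "+"        # '+' higher value is the target, '-' lower is the target

    def on_track(self, measured: float) -> bool:
        """Hypothetical helper: has the measured value reached the target?"""
        if self.polarity == "+":
            return measured >= self.target
        return measured <= self.target
```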

Level 5: Pioneer - The Data Classification Automation Tool. - The Entity must prove Continuous Improvement Mechanisms by attaching a report / document issued from the tool used in the automated DC process, clarifying the automation processes & automation phases.

Checklist – Data Classification Domain

DC.MQ.3 Has the Entity reviewed all its classified Datasets and Artifacts to ensure that the classification levels assigned to them are the most appropriate ones, as specified by the Data Classification (DC) Policies?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - The Current Practices of DC Reviews. - The Entity must attach a report clarifying the current practices of reviewing the classification of all Classified Datasets & Artifacts for the following:

 To ensure that the assigned classification level for each is suitable, in line with the Data Classification Policies & Regulations.

 To change the classification level if the Data status changes.

Level 2: Defined - The DC Review Mechanism. - The Entity must attach a document specifying the Data Classification Review Mechanism, including the following, as a minimum:


 Verification of accuracy of the data collected.

 Verification of data classification levels.

 Verification of validity.

 Identifying errors and corrections.

 Documenting the Data Classification Review Process.

Level 3: Activated - The Data Classification Review Report. - The Entity must attach a report that proves periodic Data Classification reviews and includes, as a minimum:

 The Reviewed Classified Datasets & Artifacts.

 The Decisions made resulting from the Review Process.

- An Evidence Document of the Published Classification Levels as Metadata. - The Entity must attach a document proving that the Entity published, on the comprehensive Data Catalog, the classification levels assigned to the Datasets. The Metadata must be published according to the process defined in the Metadata & Catalog Management (MCM) Data Management (DM) Domain.

Level 4: Managed - The Monitoring Report of the DC Review Mechanism with Pre-defined KPIs. - The report must be prepared based on KPIs Data (Indicator Cards) pre-defined for the Data Classification Review Mechanism, and each indicator’s data or card should include the following, as a minimum:
 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).


Level 5: Pioneer - Data Classification Review Mechanisms Review Report. - Attach a report showing that the Entity has identified, implemented and monitored mechanisms for continuous improvement of the Data Classification review processes.

8.2.14. Personal Data Protection Domain

Checklist – Personal Data Protection Domain

PDP.MQ.1 Has the Entity performed an initial Personal Data Protection (PDP) assessment and developed a plan to address the strategic and operational Privacy requirements?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - Evidences of the Existing Practices of the PDP Domain and Data Privacy. - A report must be attached showing current PDP & Data Privacy practices, including proofs for those practices.

- The Initial PDP Assessment Result. - The assessment should be aligned with the national Personal Data Protection Law (PDPL) and include the following, as a minimum:

 Identification of the types of personal data being collected.

 Location & method of personal data storage.

 Current processing & uses of the personal data.

 Privacy challenges to meet compliance with the Personal Data Protection Regulations published by NDMO-SDAIA.

Level 2: Defined - The Approved PDP Implementation Plan. - The plan should include the following, as a minimum:

 A roadmap of activities & milestones to achieve conformity commitment and maintain full compliance with the PDP Policies published by NDMO-SDAIA. The activities shall include, as a minimum, what is necessary to achieve the PDP Domain specifications.

 Allocating the required resources & budget to achieve full compliance with the PDP Policies published by NDMO-SDAIA.

- The PDP Training Plan. - A report must be attached including an approved, valid plan for training the Entity’s employees in the PDP Domain, including, as a minimum:

 The scope & objectives of the training process, including the various PDP Domain topics mentioned in the “Data Management and Personal Data Protection (DM & PDP) Standards” document.

 The methods & channels through which the training plan will be implemented.

 The training plan execution dates.


Level 3: Activated - The PDP Plan Implementation Status Report. - A report must be attached showing the implementation status including, as a minimum:

 The achievement percentages of the initiatives & projects included in the PDP implementation plan.

- Evidence of PDP Training Activities Conducted. - A report must be attached showing the implementation status of the PDP training for all employees including, as a minimum:

 The details of the training conducted for the Entity’s employees.

 The training target audience.

 The methods & channels for the training.

 The number of the Entity’s employees who were trained in the PDP Domain, and a list of training topics including:

 Importance of Personal Data Protection and the impacts and consequences to the Entity and / or Data Subject.

 Definition of Personal Data.

 Data Subject Data Rights.

 Entity and Data Subject Responsibilities.

 Notifications: when the Entity and / or Data Subject should be notified, and how to handle inquiries about personal data collection, processing and sharing.

Level 4: Managed - The Monitoring Report for the PDP Plan with Pre-defined KPIs. - The monitoring report must be prepared based on pre-defined KPIs, and each indicator’s data or card should include the following, as a minimum:
 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.


 Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer - Review Report and Continuous Improvement of the Personal Data Protection Plan. - A report must be attached showing the Entity’s periodic review and continuous improvement of the PDP plan, to ensure continuous compliance with applicable regulations and other requirements or environmental influences. The report must include, as a minimum:

 Documentation of periodic reviews of the personal data protection plan and documented results.

 Continuous improvement mechanisms to update the data privacy plan.

Checklist – Personal Data Protection Domain

PDP.MQ.2 Has the Entity defined and implemented Privacy policies and processes for Personal Data, including Data breach identification, consent management, Data Subject rights, and Privacy risk assessments?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - Evidences of the Existing Initiatives for PDP and Data Privacy. - Proofs must be attached for the current PDP & Data Privacy initiatives performed by the Entity. For example, but not limited to:
 The Entity’s current PDP policies & processes.

 Any documents presenting the personal data breach identifications.

 The documentation of the Entity’s current practices of consent management, and the rights of the Data Subjects.

 Privacy risk assessments conducted by the Entity.

Level 2: Defined - The Documented Data Breach Notifications Process. - The Entity must attach a document of the data breach notification processes, which require the person in charge of data control or data processing who deals with personal data at the Entity to notify the Regulatory Authority in the event of a breach of personal data, within the time frame specified in the Personal Data Protection Policy issued by the National Data Management Office; note that the time frame for reporting is 72 hours.

- Please refer to the Personal Data Protection Policy issued by the National Data Management Office for more detailed requirements.
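The 72-hour reporting window above reduces to a simple deadline computation. This is an illustrative sketch only; the authoritative requirement is the NDMO policy text, and all names here are assumptions:

```python
# Illustrative check of the 72-hour breach-notification window noted above.
from datetime import datetime, timedelta

REPORTING_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    """Latest time by which the Regulatory Authority must be notified."""
    return breach_detected_at + REPORTING_WINDOW

def notified_in_time(breach_detected_at: datetime, notified_at: datetime) -> bool:
    """True if the notification was sent within the reporting window."""
    return notified_at <= notification_deadline(breach_detected_at)
```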

- The Documented Data Breach Management Process. - A document must be attached containing the procedures & processes of Data Breach Management. The Data Breach Handling Management Process shall include, as a minimum:

 Conducting an incident review by the Data Controller / Data Protection Controller with the Regulatory Authority.

 Formulating an immediate response to the incident by the Data Controller and / or Data Processor.

 Implementing the permanent corrective actions as issued by the Regulatory Authority.

 Testing the implemented corrective actions to validate the PDP solution(s) efficiency.

- The PDP & Data Privacy Notice and the Consent Management Process. - The Entity shall attach a document containing the procedures & processes of the PDP / Privacy Notice and Consent Management Process, considering the following components, as a minimum:

 Define and document the processes of providing Data Subjects with notice and requesting consent at all the data lifecycle phases where / when data is collected, as prescribed by the PDP Policies & Regulations published by NDMO-SDAIA.

 The Entity shall provide the Data Subject with all possible
options; and the Entity must get the Data Subject’s (Explicit
or Implicit) consent / approval regarding the collection, use or
disclosure of personal data.

 The Entity shall document and make available a PDP / Privacy Notice for Data Subjects to read / review before or at the time the Entity requests permission to collect personal data.

- The Data Subjects' Rights Management Processes. - The Entity shall attach the Data Subjects' Rights Management procedures & processes document, as it is a must to establish & document the operations that support the rights of Data Subjects, in compliance with the PDP Policies & Regulations published by NDMO-SDAIA, whereby a Data Subject enjoys the following rights:

 Right to be informed.

 Right to access.

 Right to rectify / correct.

 Right to erase or destroy.


 Right to object.

 Right to restrict processing.

 Right to data portability / transfer.

- The Entity should inform the Data Subjects about their rights and provide
possible means by which Data Subjects requests are submitted, responded to
and tracked.

- Entity-Specific PDP Policies. - The Entity shall attach its PDP Policies document including:

 Policy Name.

 Release date.

 Version number.

 Document control (Preparation, review, approval).

 Policy Statement.

 Job roles & responsibilities.

 Version history / record.

 Terminologies.

 Objective.

 Scope of work.

 References.

 Policy Owner.

Level 3: Activated - The Developed & Adopted Consent Management Workflow. - Evidences shall be attached proving the automation of the developed & approved workflow for the consent management process.

- Evidences of Notifications Sent to the Regulatory Authority within the Allotted Timeframe. - Evidences shall be attached proving that the notifications were sent by the person in charge of data control or data processing who deals with personal data at the Entity to the regulatory / legislative authority within the specified reporting time frame of 72 hours. If no incident occurred, an approved letter should be attached.

- Evidences of Data Breach Management Including Identified Data Breaches. - Evidences shall be attached proving Data Breach Management, including the identification & detection of data breaches which occurred. If no incident occurred, an approved letter should be attached.

- The Results of the PDP Risk Assessments. - The Entity shall attach a document containing the procedures & processes of the PDP risk assessments plan / operations, as the Entity must conduct a yearly risk assessment of the operation and use of the information systems containing personal data, including the collection & processing of personal data, and the storage & transmission of personal data by each system, whether automated or manual. The risk assessment findings must be as follows, at minimum:

 Documented.

 Analyzed for impact and the occurrence likelihood.

 Evaluated against current regulatory obligations and resolution criticality.

- Evidences of Published Data Subjects' Rights Management Processes and Feedback Received from Data Subjects. - Evidences shall be attached proving that the processes were published clearly and that feedback was collected from users, especially Data Subjects.

- The PDP Register. - The Entity should attach a report of the compliance records maintained (record of
any collection and/or processing of any personal data), and evidence showing
that the records were made available to the regulatory authority (NDMO) as
defined in the PDP Policies & Regulations.

Level 4: Managed - The Monitoring Report for the PDP & Data Privacy Practices with Pre-defined KPIs. - The report must be prepared based on the data of the pre-defined Key Performance Indicators (KPIs) (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:
 Indicator’s Name / Code.

 Indicator’s Owner.

 Indicator’s Coordinator.

 Indicator’s Description.

 The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

 Indicator’s Equation.

 Measurement Unit (Percentage, Number / Quantity, etc.).

 Baseline (Measurement value in the first measurement year).

 Target value.

 Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

 Data sources used to calculate the indicator.

 Data collection mechanism.

 Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).


- The Compliance Monitoring Report & Audit Results. - The Entity shall conduct internal audits to monitor compliance with the PDP Rules & Data Privacy Regulations and shall document its findings in a report presented to the Data Protection Officer. In non-compliance cases, the Entity shall take corrective actions, notify the Regulatory Authority and the NDMO, and document the corrective actions within the audit findings report.

Level 5: Pioneer - The Documented Periodic Reviews & Outcomes for the PDP & Data Privacy Practices. - A report must be attached including updated PDP & Data Privacy processes & practices documents, showing the operations of reviewing the PDP & Data Privacy processes & procedures. The report shall include, as a minimum:
 Based on the identified matters / issues, such as risk re-assessment, the Entity must update the documents (Policies & Procedures for combating everything related to breaching Data Privacy) and must attach evidence of such reviews.

- Evidences of Automation & Change Management for PDP. - The Entity must address changes to the PDP processes & practices and must attach continuous improvement mechanisms to prove effective Change Management in relation to the required changes.

- The Entity must attach an evidence of process automation (an updated workflow, any system changes) to include these changes as part of the implementation.

8.3. Appendix III – Operational Excellence (OE)
The Operational Excellence (OE) component of the NDI utilizes information captured from various national data platforms (e.g., ODP, DL, DMP, GSB, NDC, and RDM) to evaluate the Entity's operational efficiency and effectiveness based on the practices currently applied in 6 Data Management (DM) domains (subject to increase). For the first round of the NDI, each Entity will be measured against these metrics: DSI.OE.02, OD.OE.01, MCM.OE.01, MCM.OE.02, DO.OE.02, and DO.OE.03. Please refer to “The Operational Excellence (OE) Document” for further details.

8.3.1. Data Sharing and Interoperability

Operational Excellence Metrics

Metric ID Metrics Platforms

- DSI.OE.01 - Percentage of attributes shared on the Government Service Bus but not produced by the agency. - Government Service Bus (GSB), National Data Catalog (NDC)

- DSI.OE.02 - Percentage of systems integrated with the National Data Lake. - National Data Lake (NDL)

- DSI.OE.03 - Time taken to process data sharing agreements. - Data Marketplace (DMP)
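Ratio metrics such as DSI.OE.02 reduce to a simple percentage of in-scope items over a total. The following is a hedged sketch (the exact formula and rounding used by the NDI platforms are not specified here; names are illustrative):

```python
# Illustrative percentage computation for a ratio-style OE metric,
# e.g. systems integrated with the National Data Lake out of all systems.

def oe_percentage(in_scope: int, total: int) -> float:
    """Return in_scope as a percentage of total, guarding against total == 0."""
    if total == 0:
        return 0.0
    return round(100.0 * in_scope / total, 2)

# e.g., 12 of 40 systems integrated -> 30.0
```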

8.3.2. Open Data

Operational Excellence Metrics

Metric ID Metrics Platforms

- OD.OE.01 - Percentage of datasets published in the Open Data Platform. - Open Data Platform (ODP)

- OD.OE.02 - Delay / lag in refreshing open datasets. - Open Data Platform (ODP)

- OD.OE.03 - Number of reported issues for the published datasets. - Open Data Platform (ODP)

- OD.OE.04 - Delay in resolving reported issues on published datasets. - Open Data Platform (ODP)

8.3.3. Data Catalog and Metadata (MCM)

Operational Excellence Metrics

Metric ID Metrics Platforms

- MCM.OE.01 - Percentage of systems catalogued in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.02 - Percentage of business attributes defined and linked in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.03 - Percentage of reporting assets defined in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.04 - Percentage of business attributes linked to attribute class standards in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.05 - Accuracy percentage of business attribute relationships in the National Data Catalog. - National Data Catalog (NDC)

8.3.4. Reference and Master Data Management

Operational Excellence Metrics

Metric ID Metrics Platforms

- RMD.OE.01 - Percentage of published reference entities. - Reference Data Management Platform (RDP), Government Service Bus (GSB)

- RMD.OE.02 - Time taken to publish new reference entities. - Reference Data Management Platform (RDP), Government Service Bus (GSB)

- RMD.OE.03 - Time taken to fix reported issues in reference entities. - Reference Data Management Platform (RDP), Government Service Bus (GSB)

8.3.5. Data Quality

Operational Excellence Metrics

Metric ID Metrics Platforms

- DQ.OE.01 - Percentage of Data Quality index in the Government Service Bus. - Government Service Bus (GSB)

- DQ.OE.02 - Percentage of Data Quality index in National Data Lake. - National Data Lake (NDL)

8.3.6. Data Operations

Operational Excellence Metrics

Metric ID Metrics Platforms

- DO.OE.01 - Percentage of delay in response time of the Government Service Bus APIs. - Government Service Bus (GSB)

- DO.OE.02 - Percentage of failed API calls on the Government Service Bus. - Government Service Bus (GSB)

- DO.OE.03 - Percentage of operational issues from agencies encountered by the National Data Lake. - National Data Lake (NDL)
