The Immunization Data Quality Self-Assessment (DQS) Tool
WHO/IVB/05.04
Original: English
Distr.: General
The immunization data quality self-assessment (DQS) tool
The Department of Immunization, Vaccines and Biologicals
thanks the donors whose unspecified financial support
has made the production of this publication possible.
All rights reserved. Publications of the World Health Organization can be obtained from Marketing
and Dissemination, World Health Organization, 20 Avenue Appia, 1211 Geneva 27, Switzerland
(tel: +41 22 791 2476; fax: +41 22 791 4857; email: bookorders@who.int). Requests for permission to
reproduce or translate WHO publications – whether for sale or for noncommercial distribution – should
be addressed to Marketing and Dissemination, at the above address (fax: +41 22 791 4806;
email: permissions@who.int).
The designations employed and the presentation of the material in this publication do not imply the
expression of any opinion whatsoever on the part of the World Health Organization concerning the legal
status of any country, territory, city or area or of its authorities, or concerning the delimitation of its
frontiers or boundaries. Dotted lines on maps represent approximate border lines for which there may
not yet be full agreement.
The mention of specific companies or of certain manufacturers’ products does not imply that they are
endorsed or recommended by the World Health Organization in preference to others of a similar nature
that are not mentioned. Errors and omissions excepted, the names of proprietary products are distinguished by initial capital letters.

All reasonable precautions have been taken by WHO to verify the information contained in this publication. However, the published material is being distributed without warranty of any kind, either express or implied. The responsibility for the interpretation and use of the material lies with the reader. In no event shall the World Health Organization be liable for damages arising from its use.
Contents

Acknowledgements
Abbreviations
Executive summary
A. Introduction
B. Immunization data quality self-assessment toolbox
   1. DQS options: overview
   2. Data accuracy
   3. Completeness/timeliness of reporting
   4. Assess the quality of the monitoring system
   5. Assessing the quality of immunization card recording (health unit level)
   6. Monitoring of wastage
   7. Monitoring of immunization safety
   8. Denominators of immunization coverage
C. Where to conduct a DQS?
D. Present the DQS findings
   1. Present the DQS results
   2. Using Excel to enter and represent the data
E. Conduct a DQS workshop
   Some proposed workshop principles
F. Integrate DQS results into the routine activities
Annex A: Sample chart for monitoring doses administered and drop-outs in children less than one year of age
Annex B: Example of a completeness/timeliness reporting table
Annex C: Standard questions to assess the quality of the monitoring system
Annex D: Child immunization card exercise (example for 20 infants)
Annex E: Sampling of health units
Annex F: Data quality self-assessment workshop schedule
Acknowledgements
The data quality self-assessment (DQS) was developed following the immunization data quality audit procedure (WHO/V&B/03.19), which was designed for use by the Global Alliance for Vaccines and Immunization (GAVI). The DQS has been tested in a number of countries (Nepal, Morocco and Togo), where local support and feedback were extremely useful and appreciated. The respective WHO regional and country offices and ministries of health (immunization divisions) of these countries are gratefully acknowledged. Financial support from GAVI contributed to the design and testing of the DQS.
Abbreviations
AD auto-disable (syringe)
AEFI adverse events following immunization
BCG bacille Calmette-Guérin (existing TB vaccine)
DTP diphtheria–tetanus–pertussis vaccine
DQS data quality self-assessment
HU health unit
MOH ministry of health
NGO nongovernmental organization
NID national immunization day
OPV oral polio vaccine
QI quality index
QQ questions on quality
RED Reaching Every District
SE standard error
TT tetanus toxoid
UNICEF United Nations Children’s Fund
VVM vaccine vial monitor
VPD vaccine-preventable disease
Executive summary
What is the DQS? The DQS is a flexible toolbox of methods to evaluate different
aspects of the immunization monitoring system at district and health unit (HU) levels.
Immunization “monitoring” refers to the regular ongoing measurement of the level
of achievement in vaccination coverage and other immunization system indicators
(e.g. safety, vaccine management). Monitoring is closely linked with reporting because
it involves data collection and processing.
Target audience. This document is to be used primarily by staff who will adapt the
toolbox for a specific area (usually staff at national and regional levels). The adapted
tool should then be used by staff collecting and using immunization data at the national,
provincial or district levels.
Uses of the DQS. The DQS aims to assist countries in diagnosing problems and to
provide orientation to improve district monitoring, as highlighted in the Reaching
Every District (RED) approach.
The final goal of the DQS is to integrate into routine practice the tool options that are most relevant for a country so that constant attention is given to improving monitoring practices and the management of immunization activities.
How to use this document? A number of options for evaluating monitoring processes
are presented in this document. They should be explored, selected and refined
according to specific needs. The DQS does not aim to be standardized across
countries. The same flexibility is required when selecting where to conduct the DQS
in a country.
A. Introduction
The data quality self-assessment (DQS) consists of a flexible toolbox, designed for
staff at the national, provincial or district levels to evaluate different aspects of the
immunization monitoring system at district and health unit (HU) level in order to
determine the accuracy of reported numbers of immunizations and the quality of
the immunization monitoring system.
The options described in the toolbox (Section B) should be explored, selected and
refined according to specific needs. The tool does not aim to be standardized across
countries. The same flexibility should be applied for the selection of DQS sites,
which is discussed in Section C.
The DQS aims to diagnose problems and provide orientation to improve district
monitoring and use of data for action, as highlighted in the Reaching Every District
(RED) approach.1 Basic knowledge of Excel is helpful when entering and analysing
collected data but the self-assessment can be conducted without computerized support.
To date, two Excel workbooks are available for different components of the toolbox
(Section D).
The approach described here to introduce the DQS concept in one country is through
a national participatory workshop (see Section E) involving key people from the
national and district levels. This workshop is immediately followed by an assessment
in a number of districts and HUs that provides a self-diagnosis on the monitoring
system of the country. Other approaches can be developed and self-assessments can
be conducted à la carte.
The final goal of this assessment tool is to integrate the options that are most relevant for one country into routine practice (Section F) so that constant attention can be given to improving monitoring practices and the management of immunization activities.
1 Increasing immunization coverage at the health facility level. Geneva, WHO, 2002 (WHO/V&B/02.27). RED is a global strategy aimed at increasing coverage and decreasing drop-out rates. It is a five-part strategy: reaching the underserved, providing supportive supervision, increasing use of data for action, increasing micro-planning capacity at district levels and using local populations in planning immunization sessions.
B. Immunization data quality self-assessment toolbox
The DQS toolbox proposes several options to assess different aspects of the
monitoring system at different levels.
[Figure: flow of immunization data. Vaccination cards kept in the community and the child register feed the HU report and the HU tabulations/monitoring chart; HU reports feed the district tabulations/monitoring chart; district reports feed the national tabulations and the WHO/UNICEF joint reporting form.]
The flow of information begins at the HU level. An HU is defined as the administrative
level where the vaccinations are first recorded; it might include private health facilities,
facilities of nongovernmental organizations (NGOs), hospitals, or a simple health
post. Typically, when a health worker administers a dose of vaccine, the date of
vaccination is immediately recorded on the child’s individual vaccination card and on
the immunization register and the dose is tallied on an appropriate sheet allowing for
the easy re-counting of all doses provided. The individual vaccination card is either
kept in the HU or (preferably) stays with the child’s caretaker (in the community)
while the register and the tally sheets are archived in the HU.
HUs usually report to a district health office on a regular basis (monthly or quarterly).
The HU report includes the number of doses of every antigen given during the
reporting period. To prepare the report, an HU officer obtains the number of doses
administered from the tally sheets. Alternatively, he/she uses the child registers to count the doses administered and enters the summed figure in the report.
The HUs should keep a copy of all reports sent to the district. They should also display the cumulative number of doses administered on a monitoring chart to track progress towards coverage targets (Annex A).
At the district office level, administrative personnel receive the reports, log the date
they are received (e.g. on a completeness and timeliness chart – see Annex B), and
follow up on late reports. They then aggregate the information from all the HUs
they oversee and send a periodic district report to the national level (or to the next
intermediate level - if one exists). Tabulations (number of doses reported by each
HU) are made (computerized or not) to allow for the calculation of the district totals.
Copies of the reports sent to the national level are kept in the district office.
Important note: In parallel with the upward flow of information, data should be analysed at each
level and fed back to appropriate levels so that the information is used for direct action.
It is important to note that the availability of all the forms is subject to many factors,
including the national policy in use. It is recommended that reports and registers be kept for a minimum of three years after the end of the calendar year in which they were used.
Note: The levels selected below include the district and HU levels only, but the
same principles apply should one or several intermediate levels exist.
The assessor will decide which source will be used to verify the information contained
in the HU reports. The HU monthly or quarterly report can be retrieved at the HU
or district level.
The accuracy of the HU sources themselves can also be checked, providing useful information on the correct use of one or the other tool. For example, verifying tally sheets against registers may reveal that a higher number of tallied vaccinations is due to poor recording in the registers.
Important note: Full understanding of the correct and recommended recording and reporting
procedures is required when selecting the sources that will be verified. Recommendations do
vary from country to country and this influences the interpretation of results.
Example: In Zanzibar, according to the national policy, immunized children who do not belong to the target area of a health unit should be tallied (on a tally sheet), but not recorded on the
health unit immunization register. Hence the comparison of re-counted immunizations in the
register and in the tally sheet for the same time period might bring out discrepancies attributable
to a correct practice (according to the national policy) and not due to poor recording.
The exercise is not only useful in detecting overreporting or underreporting but also
allows examination of the correct recording of immunization cards. It can also assess
the proper use of the immunization register and allow an estimation of valid doses
(i.e. doses given at the right time and with a proper interval).
In situations where the child was indeed vaccinated but the date put on the register
was systematically wrong (for example because the health worker puts the date of
planned vaccination instead of the actual date of vaccination), the exercise can provide
an estimation of timely doses, i.e. given in the recommended time schedule, according
to the information retrieved from the card.
Card retention in the community may be a problem and the assessors need to agree
on what to do in case of missing cards. It is recommended that the history of vaccination by parents' recall be used if a card is not available.
Similarly, the assessors need to think about which strategy to adopt if a child in the community cannot be retrieved (option a). Reasons may indeed include overreporting
but also family move, temporary absence, etc. It is recommended to make every
attempt (including contacting neighbours, administrative entity, etc.) to verify whether
children recorded on a register exist.
In option b, the assessors should make sure that the vaccinations that are verified
from immunization cards in the community have been provided by the selected HU(s)
and not by other units so that they can potentially be retrieved in the registers.
Selection of children/mothers in a register (option a)
A minimum of 5–10 children/mothers should be selected per HU. Depending on time and logistics, they can be selected from the register:
• from the same locality (to limit transportation costs) if the address is mentioned
in the register;
• by retrieving x of the most recently immunized infants/mothers in the register
(the most recent will be less likely to have moved from the area);
• by choosing randomly within a time period;
• or a combination of the above options.
Each time, the verified information (from the “lower” level in the data flow) is on
the numerator and the reported information (retrieved from the “higher” level in the
data flow) is on the denominator, so that:
• a percentage < 100% shows that not all reported information could be verified;
• a percentage > 100% shows that more information was retrieved than was
reported.
It is theoretically possible to develop several accuracy ratios, basically for each level
and source assessed against another one. The assessment should focus on accuracy
ratios that are most relevant in order to avoid confusion with a high number of
different accuracy ratios.
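As a rough sketch of the computation described above (in Python rather than the Excel workbooks the document mentions; the reported figure of 460 is hypothetical), with the verified count as numerator and the reported count as denominator:

```python
def accuracy_ratio(verified, reported):
    """Accuracy ratio (%): verified (lower-level) count over reported (higher-level) count."""
    if reported <= 0:
        raise ValueError("reported count must be positive")
    return 100.0 * verified / reported

# A ratio below 100% means not all reported doses could be verified;
# above 100% means more doses were retrieved than were reported.
print(round(accuracy_ratio(447, 460), 1))  # 97.2
```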
2.3.2 Interpretation
Possible reasons for a very high verification (accuracy ratio > 100%):

Underreporting
• Reports not complete at the time of forwarding
• No use of standard tools to adequately report the daily number of immunizations performed
• Transcription or calculation error
• Loss of information
First, we will obtain an accuracy ratio for each district, giving each HU its respective weight. The weight of each HU corresponds to the proportion of the HU population out of the total sample, e.g. for the first of two HUs:

weight of HU1 = 447 / (447 + 321) = 58.2%

Then, to obtain a provincial estimate, the weight of each district will be taken into account: each district accuracy ratio should be multiplied by the proportion of the district in the province in a similar calculation, which gives the provincial HU registers/HU reports accuracy ratio estimate. One should not simply take an average of the three accuracy ratios of the three districts to obtain a provincial accuracy ratio, such as (97 + 84 + 105) / 3 = 95%, because the weight of each district should be taken into account.
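A minimal sketch of this weighting arithmetic (the HU figures 447 and 321 and the district ratios 97%, 84% and 105% come from the text; the district weights in the last line are hypothetical):

```python
def weighted_ratio(ratios, weights):
    """Weighted average of accuracy ratios; weights are, e.g., HU or district populations."""
    total = sum(weights)
    return sum(r * w / total for r, w in zip(ratios, weights))

# HU weight within the district sample, as in the text: 447 / (447 + 321) = 58.2%.
print(round(100 * 447 / (447 + 321), 1))  # 58.2

# District ratios of 97%, 84% and 105% must not simply be averaged (95.3%);
# with hypothetical district weights the provincial estimate shifts accordingly:
print(round(weighted_ratio([97, 84, 105], [768, 500, 600]), 1))
```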
2.3.3.2 To aggregate accuracy ratios from different levels

One can also combine the accuracy ratios of two different levels to provide an overall accuracy figure. The basic principle is to multiply the ratios. Procedures to obtain an estimate for one level should already have been conducted as described in 2.3.3.1. In the following example, the two accuracy ratios for the same antigen and time period are:

• HU registers / HU reports found at the district, i.e. regHU / repHU; and
• copies of all district reports found at the district / district data found in the national tabulation, i.e. repDIS / tabDIS.

The overall accuracy is then:

(regHU / repHU) x (repDIS / tabDIS)
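A sketch of this multiplication (the counts are illustrative, not from the document):

```python
def combined_accuracy(reg_hu, rep_hu, rep_dis, tab_dis):
    """Overall accuracy across two levels: (regHU / repHU) x (repDIS / tabDIS), in %."""
    return 100.0 * (reg_hu / rep_hu) * (rep_dis / tab_dis)

# Illustrative counts: 95% accuracy at the HU level and 98% at the district
# level combine to roughly 93% overall.
print(round(combined_accuracy(950, 1000, 980, 1000), 1))  # 93.1
```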
where:

SE(p) = sqrt( P x (1 − P) / N )

P is the accuracy ratio
N is the total sample size (reported values)
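For instance (a sketch in Python; the 95% ratio and sample of 1000 are illustrative):

```python
import math

def standard_error(p, n):
    """Standard error of an accuracy ratio P (as a proportion) over N reported values."""
    return math.sqrt(p * (1.0 - p) / n)

# e.g. an accuracy ratio of 95% verified against 1000 reported doses:
print(round(standard_error(0.95, 1000), 4))  # 0.0069
```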
Collection sheet for accuracy (health unit level). Vaccinations – number retrieved, with one column per month (Jan–Dec), a "Total" column and a "Total (selected months)"b column:

Source                                        Total    Total (selected months)b
Register (tally the re-counted immunizations)
Register (total)                                a                a'
  Month complete (Yes/No)
Tally sheet                                     b                b'
  Month complete (Yes/No)
HU reports (HU level)                           c                c'
HU tabulation                                   d                d'
HU report (district level)                      e                e'
District tabulation                             f                f'

a If information is not available, put NA.
b For the column "Total (selected months)" it is optional to circle the months to be considered.

Accuracy ratios:   a/b: %   a'/b': %   b/e: %   b'/e': %   b/f: %   b'/f': %
Table 4: Collection sheet for accuracy of district tabulation. Example for three selected months and two antigensa

(One row per health unit, HU1 to HU12; for each selected month and antigen, the figure from the HU report is entered. The final row, TOTAL (calculated sum of HUs), gives the totals a1 and a2 for the two antigens.)

Date: ______/______/_______

a Put NA if not available.

Accuracy ratios:
Indicate the year: __________________________________

                                 Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec   Total
Reports (district level)                                                             a
Tabulation (district level)                                                          b
Reports (national level)                                                             c
Tabulation (national level)                                                          d

If a report is not available, put NA.

Accuracy ratios:   c/d: %   b/d: %
Table 6: Data collection form for antigen X: from the register to the community

Columns: Serial No. | Register No. | Address | Name of child | Date of birth | Date of antigen X vaccination (register) | Card possession | Date of antigen X vaccination (card) | Antigen X vaccination history | BCG scar verified | Remarksa

(Ten numbered rows, 1 to 10, one per selected child.)

a The "Remarks" column can be used to record whether the vaccination is timely, whether the dose is valid, etc.
The information can be verified against the HU reports available at the HU,
aggregated data tables (if the HU had to aggregate the information) in HU tabulations,
HU reports available at the district level, or in tabulations (aggregated data tables)
of the district.
A number of accuracy ratio options are presented in Table 3 but it should be decided
which are going to be the most relevant.
Table 4 corresponds to (a) under paragraph 2.1.4, verifying the coverage data sent
by the district. It assesses whether the monthly totals of the HUs (as found in the
HU reports) correspond to the figure aggregated and sent by the district to the
higher level. It can also be used for assessing the availability of HU monthly reports
at district level. It should be adapted for the number of HUs in the district, number
of months for which the HU reports are checked, etc.
Table 5 corresponds to (b) under paragraph 2.1.4, verifying the coverage data sent
by the district. It compares all sources of information for the reported district figure,
either at district or national level. Again, the appropriate accuracy ratio(s) should be
chosen.
3. Completeness/timeliness of reporting
Each district should monitor the completeness and timeliness of unit reporting as a quantitative core measure of the quality of the reporting system.
In both situations, findings should be discussed in terms of causes, actions that have
been taken to correct the problem, and solutions if the problem persists.
If the amount of time needed to verify each report of all HUs for one year is too
high, an alternative is to choose randomly a number of months for which the
information will be collected.
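The random selection of months can be sketched as follows (a minimal illustration in Python; drawing three months is an assumption, the number should fit the available time):

```python
import random

# The twelve reporting months; three are drawn at random for verification.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
rng = random.Random()            # could be seeded for a reproducible selection
selected = rng.sample(months, 3)
print(sorted(selected, key=months.index))
```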
If this is not available, the procedures described under paragraph 3.1 can also be
done as a proxy for getting HU reporting completeness and timeliness figures.
However, non-available HU reports may indeed not have been obtained from the
lower level or may have been lost or destroyed (physically) by the district.
It is of particular interest for the national level to go into this option of the DQS;
usually, the national level has an idea of the district to national completeness and
timeliness but hardly knows the situation at the lower level (i.e. completeness and
timeliness of HU to district level).
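The completeness and timeliness indicators discussed in this section can be sketched as below; the unit names, dates and reporting deadline are all hypothetical:

```python
from datetime import date

# Hypothetical log of January reports received at the district office.
expected_units = ["HU1", "HU2", "HU3"]
received = {"HU1": date(2005, 2, 3), "HU2": date(2005, 2, 12)}  # HU3 never reported
deadline = date(2005, 2, 5)      # assumed national deadline for January reports

# Completeness: share of expected reports received at all.
completeness = 100 * len(received) / len(expected_units)
# Timeliness: share of expected reports received by the deadline.
timeliness = 100 * sum(1 for d in received.values() if d <= deadline) / len(expected_units)
print(round(completeness, 1), round(timeliness, 1))  # 66.7 33.3
```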
4. Assess the quality of the monitoring system
4.1 Overview
The assessment of the quality of the immunization monitoring system is based on questions, observations or exercises that can be posed, made or assessed at each visited level (district, sub-district, HU, etc.). Each question should have a "yes", "no" or "NA" (not applicable) response so that a score can be assigned according to the response. A list of proposed questions for each level is
presented in Annex C. These questions/observations/tasks can be grouped into the
different assessed components of the monitoring system. Table 7 proposes a number
of components of the monitoring system based on the usual steps of the collection
and use of data.
The questions should be selected and revised according to each country situation.
The grouping into components is also adaptable: either these components or others to be defined (e.g. availability of forms, etc.) can be used.
District data of the current and previous year should be analysed to identify and
quantify causes of poor data quality and to find solutions. This will help to refine the
qualitative questions that will be asked during the DQS process.
For each component and each level of the monitoring system, i.e. at district and
HU, average scores can be obtained and standardized as a percentage or on a scale
from 0 to 10.
The weight to assign to a question can be determined by asking each participant to score the question, dividing the sum of the scores by the number of people in the team, and choosing the next round number as the weight (Table 8 below). The weights for each question should be agreed upon before the assessment.
Table 8: Method of assigning weights for each qualitative question in the DQS
(scores in the table are examples, allowed range in the Excel tool is 1–3)
Once the QQs have been selected, a form should be printed in hardcopy for the field
assessment.
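A minimal sketch of the weighting and scoring arithmetic described above (assuming "next round number" means rounding up, clamped to the 1–3 range allowed in the Excel tool, and that "yes" earns the question's weight while "NA" questions are excluded; the Excel workbooks' exact behaviour may differ):

```python
import math

def question_weight(team_scores):
    """Average the team's proposed scores and take the next whole number up (range 1-3)."""
    return min(3, max(1, math.ceil(sum(team_scores) / len(team_scores))))

def component_score(answers):
    """answers: (weight, response) pairs; response is 'yes', 'no' or 'NA'.
    Returns the component score standardized as a percentage."""
    applicable = [(w, r) for w, r in answers if r != "NA"]
    earned = sum(w for w, r in applicable if r == "yes")
    total = sum(w for w, _ in applicable)
    return 100.0 * earned / total

# Four assessors propose 2, 3, 2 and 2 for one question: mean 2.25 -> weight 3.
print(question_weight([2, 3, 2, 2]))                        # 3
print(component_score([(3, "yes"), (2, "no"), (1, "NA")]))  # 60.0
```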
5. Assessing the quality of immunization card recording
(health unit level)
The assessment of the quality of the immunization card recording can be done during an immunization session: assessors ask the mother or father for the filled card after the child has been immunized and check whether the vaccination(s) were correctly provided and recorded. This is suggested in countries where the proportion of non-valid doses has been shown to be high (from coverage survey data).
To conduct the exercise, ask the vaccinator to complete a health card for a child who
is supposedly brought to the HU on the day of the assessment. Then ask the vaccinator
to determine the next return date. This will assess the vaccinator’s abilities to determine
what vaccines are needed for a child and to correctly complete the vaccination card.
Annex D describes an example of the exercise for 20 children.
The observation and the exercise can be integrated into the quality index score and
one should determine which score to give in case of successful and unsuccessful
answers from the health worker (see QQs in Annex C).
6. Monitoring of wastage

6.1 Overview
Two options can be explored during a DQS at HU or district level:
• The first option is to go through the documents that provide information on
vaccine wastage and determine whether the wastage calculations and
monitoring are understood and done correctly. This can be assessed specifically
or through QQs.
Information about the number of used doses can usually be found in the
following documents:
− the HU/district vaccine ledger, describing all vaccine movements
(shipments/deliveries and despatches), with the balance;
− the HU/district monthly reports, where these contain information on the number of vials used at the HU or in the district (sum of the HUs);
− stock receipts, invoices, etc.
• The second option is to review the documents, allowing for wastage calculation
for a specific time period, and determine the vaccine wastage for the setting.
This second option allows you to obtain a figure for the HU or the district,
discuss it, and promote monitoring of wastage based on real calculations.
6.2 Definitions2

Unopened vial vaccine wastage can be calculated at the store level (district).
At district level, the wastage of unopened vials falls mainly into the following
categories:
• vaccines discarded due to vaccine vial monitor (VVM) indication,
• heat exposure,
• vaccines frozen,
• breakage,
• theft,
• vaccines discarded due to expiry dates,
• missing inventory.
2 Monitoring vaccine wastage at country level: Guidelines for programme managers. Geneva, WHO, 2003 (WHO/V&B/03.18).
The last item corresponds to “unexplained number of doses not matching an inventory
count” when one is conducted.
For example: on 1 August, according to the vaccine stocks ledger, the DTP balance
is 3000 doses but the physical inventory of the refrigerator contents of that day
records 2940 doses; the balance should therefore be adjusted to 2940 in the book,
with a note that 60 doses are “missing”. The 60 doses fall into the category of
unopened-vial wastage.
The unopened vial wastage is calculated as the proportion of unopened doses wasted (numerator) out of the number of doses handled by the store (denominator):

unopened vial wastage rate (%) = (number of unopened doses wasted / number of doses handled by the store) x 100
Additionally, the total vaccine wastage occurring in one district can be calculated
from all figures coming from all HUs vaccinating in the district in addition to the
unopened vial wastage at the district store. This calculation needs information from
all HUs.
The global vaccine wastage rate (%) = 100 − vaccine usage rate, where:

vaccine usage rate (%) = (number of doses administered / number of doses issued) x 100

and:

number of doses issued = number of doses in stock at the beginning of the period + number of doses received during the period − number of doses in stock at the end of the period
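The wastage arithmetic above can be sketched as follows (the stock figures are hypothetical, illustrative only):

```python
def doses_issued(stock_start, received, stock_end):
    """Doses issued = opening stock + doses received - closing stock."""
    return stock_start + received - stock_end

def wastage_rate(administered, issued):
    """Global vaccine wastage rate (%) = 100 - usage rate."""
    return 100.0 - 100.0 * administered / issued

def unopened_wastage_rate(unopened_wasted, handled):
    """Unopened-vial wastage (%): wasted unopened doses over doses handled by the store."""
    return 100.0 * unopened_wasted / handled

# Hypothetical district month: 500 doses opening stock, 1000 received, 300 closing.
issued = doses_issued(500, 1000, 300)   # 1200 doses issued
print(wastage_rate(960, issued))        # 20.0
# The 60 "missing" DTP doses from the ledger example, over 3000 doses handled:
print(unopened_wastage_rate(60, 3000))  # 2.0
```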
Interpretation
Whatever the figure found at any level, it is crucial to try to identify and discuss the main causes of
wastage. The importance of monitoring wastage should always be stressed. The level of
immunization coverage should also be taken into account in the interpretation: classically at
higher coverage levels, including more difficult-to-reach children (e.g. through outreach sessions),
the wastage rate is likely to increase, and a higher wastage figure may be more acceptable.
7. Monitoring of immunization safety

(This list is only indicative: it is not exhaustive and does not consist of a minimal set of necessary information.)
1. The verification that indicators have been effectively defined and that they
are well monitored. This can be done through QQs (see examples in Annex C,
QQ 6, 21, 42 and 50 for the district level).
2. The verification of the quantitative data collected which allowed for the
calculated indicator. This can be based on information available at the district
or in selected HUs. A procedure similar to the verification of coverage data
can be undertaken according to the selected indicator.
8. Denominators of immunization coverage
The different population groups targeted for routine immunization services are usually:
• infants (i.e. 0–11 months of age) for primary vaccinations, and
• pregnant women for TT vaccination.
Falsely high or low estimates of population numbers can introduce large inaccuracies
in coverage estimates. District and locality denominators are often officially provided
by a more central level, based on national statistics and census projection, but they
may be inaccurate. It is therefore of great importance for the more peripheral levels
(district and HU) to take into consideration local information to estimate and use a
number for their populations that is as precise as possible. This can be done with the
use of birth registries, local household census, etc.
In any case, the denominators should include the entire population living in the
catchment area of the HU or the district, even in the case of moving populations,
populations not registered, contraindications, etc.
The DQS assesses the denominator issue through the QQs in order to explore understanding and practice (Annex C, QQs 23–31 for the district level, and QQs 29–32 for the HU level). These questions can of course be adapted and revised.
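To illustrate how a denominator error distorts administrative coverage (all figures hypothetical):

```python
def coverage(doses_administered, target_population):
    """Administrative coverage (%) for one antigen and period."""
    return 100.0 * doses_administered / target_population

# 900 third doses given to infants; an official projected denominator of 1200
# versus a local birth-registry count of 1000:
print(coverage(900, 1200))  # 75.0
print(coverage(900, 1000))  # 90.0
```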
C. Where to conduct a DQS?

A choice of sites to be visited must be made. However, the greater the sample size, the more precise the results will be.
To provide a reasonable idea of the situation in one district, visits are recommended to at least three HUs per district, with a maximum of six HUs. Visits to more HUs are not recommended because they are not likely to provide additional information and will waste resources. Figure 2 shows that the maximal reduction in the standard error (SE) – hence a higher precision, see para. 2.3.3.1 – is obtained when the sample size is increased up to six HUs; with higher sample sizes, the decrease of the SE is marginal. The DQS does not provide elements for sample size calculation as it is felt that the discussions behind the obtained figures are more important than the figures themselves. Common sense and logistic practicalities should dictate the number of visited places.
Figure 2: Standard error reduction (in %) according to the sample size
(i.e. number of sampled health units)*

[Chart: the SE reduction rises steeply up to about six sampled HUs and then flattens, for sample sizes from 0 to 25 HUs and reductions from 0% to 100%.]

* Example calculating standard errors for a sample size of 1000 per health unit with a 50% accuracy ratio.
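The relationship shown in Figure 2 can be reproduced with a simple binomial approximation. This is a minimal sketch assuming the figure's footnote parameters (1000 records per HU, 50% accuracy ratio) and ignoring any clustering between HUs; `standard_error` is an illustrative helper, not part of the DQS tool itself:

```python
import math

def standard_error(n_units, records_per_unit=1000, p=0.5):
    """Binomial standard error of an accuracy ratio estimated from
    n_units health units, each contributing records_per_unit records.
    Ignores clustering and finite-population corrections."""
    n = n_units * records_per_unit
    return math.sqrt(p * (1 - p) / n)

# SE reduction relative to sampling a single health unit
baseline = standard_error(1)
for k in (1, 3, 6, 12, 25):
    reduction = 100 * (1 - standard_error(k) / baseline)
    print(f"{k:2d} HUs: SE reduced by {reduction:.0f}%")
```

Because the SE shrinks with the square root of the sample size, most of the reduction is already obtained by six HUs, which is what motivates the recommended maximum above.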
Alternatively, all districts/HUs can be assessed over a period of time by the higher
level, for instance, as part of supervisory visits.
The selection of the visited districts or HUs can be done according to the following
four options:
1) Representative selection, based on random sampling: This approach is based
on the assumption that the selection of sites should be representative of the
entire system if the recommendations are to be relevant to the whole system.
This option has the advantage of providing estimates which can be applied to
the district. It avoids any temptation to conduct the assessment in areas supposed
to be very strong or weak. Annex D provides guidelines on how to proceed
with random sampling.
2) Representative selection within defined strata: A stratum refers to a
subpopulation of an entity. It may be defined in many ways, e.g. according to
the size of the constituent units (i.e. the number of immunizations provided
by an HU), their type (e.g. hospital/HU or urban/rural) or their location.
This option has the advantage of providing estimates which can be extrapolated
to subpopulations. It is useful to differentiate problems and actions which can
be different from one HU type to another.
3) Problem-oriented selection: districts/HUs where data quality problems are
known or suspected can be deliberately selected. Poor data quality HUs/districts
also include those where the turnover of health staff is high or where key posts
are vacant. Supervision reports also indicate good or poor recording, reporting
and monitoring practices.
4) A combination of the above. This approach combines the advantages of the
problem-oriented approach and the fact that a selection bias for any “preference”
can be avoided.
The conclusions drawn from the sample will need to take the sampling strategy into
consideration. If the sample is not representative, then the results cannot be
generalized; they can only be extrapolated to the structures which were sampled.
Findings obtained from one district cannot be extrapolated to other districts. However,
it is likely that common problems and difficulties are shared among a number of
areas. Results could be disseminated through feedback reports and meetings so that
solutions can be shared.
D. Present the DQS findings
The data quality assessment provides a certain amount of information on the status
of records and practices related to the reporting system.
All options in the DQS toolbox provide quantitative measures which can be followed
easily over time and used to compare different areas. The use of the tool will be
particularly interesting when several districts can be compared or a district can be
compared to itself over time.
Assessment findings should be presented to and discussed with the level that was
assessed, but also with the national level, so that lessons can be drawn and solutions
proposed for the whole country.
1.1 Accuracy
The raw figures and accuracy ratios can be presented in tables such as Table 9.
Graphic presentations can be helpful to present and discuss the findings. An example
is the use of a bar chart (Figure 3).
[Figure 3: bar chart of accuracy ratios, 0% to 100%, for HU1 to HU6.]
The accuracy can also be presented in terms of “accurate months”, defined as months
for which the verified information was perfectly accurate (100% match).3

3 The threshold can be defined with some flexibility, e.g. 95%, 90%, etc.
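As an illustration, the accuracy ratio and the count of accurate months can be computed as follows. The monthly counts are hypothetical and the function names are illustrative, not part of the DQS forms:

```python
# Hypothetical monthly verified (recounted) vs reported DTP3 doses per HU
verified = {"HU1": [12, 10, 15], "HU2": [9, 11, 14]}
reported = {"HU1": [12, 10, 17], "HU2": [9, 11, 14]}

def accuracy_ratio(ver, rep):
    """Verified doses as a percentage of reported doses over the period."""
    return 100 * sum(ver) / sum(rep)

def accurate_months(ver, rep):
    """Months for which the verified count exactly matches the report (100% match)."""
    return sum(1 for v, r in zip(ver, rep) if v == r)

for hu in verified:
    print(hu, f"{accuracy_ratio(verified[hu], reported[hu]):.0f}%",
          accurate_months(verified[hu], reported[hu]), "accurate months")
```

A looser threshold (e.g. 95%) can be implemented by comparing the monthly ratio instead of requiring an exact match.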
1.2 Quality index scores
The measures can also be presented using bar charts, in percentages or using the raw
numbers. The following representation (Figure 4), called a radar graph, provides a
way to compare all components: average scores are presented in this example on a
scale from 0 to 10. It is easily produced in Excel.
[Figure 4: radar graph of average quality index scores on a 0 to 10 scale, one axis per component (e.g. recording).]
Every presentation should be followed up by an action plan – drafted at the time of the meeting –
outlining roles and responsibilities.
To date, two simple Excel workbooks are available to assist with data entry and
analysis:
• One is on the QQs, which aggregates the quality indices of selected HUs for
the district level. Automatic charts are presented using the radar graph option
described above.
• One is on the calculation of wastage, which aggregates the vaccine wastage
rates of selected HUs for the district level.
Instructions on how to use the workbooks are detailed on the respective “Read me”
worksheets of the two workbooks.
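The aggregation performed by the QQ workbook can be sketched as follows. The scores and component names here are hypothetical, and the actual workbook may compute its indices differently:

```python
# Hypothetical quality index scores (0 to 10) per component for three assessed HUs
scores = {
    "HU1": {"recording": 8.0, "archiving": 6.5, "reporting": 9.0},
    "HU2": {"recording": 6.0, "archiving": 7.5, "reporting": 7.0},
    "HU3": {"recording": 7.0, "archiving": 5.0, "reporting": 8.0},
}

def district_averages(hu_scores):
    """Average each component across HUs, as plotted on the radar graph."""
    components = next(iter(hu_scores.values()))
    return {c: sum(s[c] for s in hu_scores.values()) / len(hu_scores)
            for c in components}

print(district_averages(scores))
```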
E. Conduct a DQS workshop
Phase 2 consists of conducting the assessment itself, in the areas and facilities chosen
by the participants, using the forms they will have designed (field work). After the
assessment, the participants convene again to share the results, perform a global
analysis, and make overall recommendations.
The suggested timeframe in Annex F is four days for phase 1, then three days for
data collection and two days for data analysis and feedback, but this should be adapted
to the time available, logistics and the number of participants. If people cannot take
the above suggested time off, a DQS workshop can be organized in six days:
two days of theory, two days of field work, one and a half days of data analysis
and half a day for recommendations and debriefing. Careful planning is essential
to success. For this reason, the facilitators should be in the country for three days
prior to the workshop for coordinated preparation.
• Although monitoring data is not a new concept, the practical aspects of applying
the DQS are new and sometimes difficult to understand if only using theoretical
concepts in a classroom-lecture style. Field work helps test the DQS but, just
as importantly, it gives the opportunity for participants from provincial and
national levels to witness the ground realities of immunization monitoring
systems. The in-class sessions themselves are most successful when participants
are actively encouraged to participate through a range of adult learning
techniques, such as simulations, practical exercises, games, illustrated lectures,
role plays, small group competitions and prizes.
• A good ice-breaking exercise is the monitoring-card game (day 1).
It consists of a series of 50 questions on monitoring systems which are put to
the participants, who are split into groups. (An Excel workbook
presents these questions on cards for participants to choose at random.) If a
group answers correctly, it is allowed to move (by throwing a die) on a
50-square game board, and the participants can compete gently. The card
questions provide an excellent overview of the available tools and best practices,
and engender a spirit of camaraderie in the workshop.
• Two approaches can be envisaged during phase 1 of the workshop: (1) the
“start-from-scratch” approach, with an entirely self-devised assessment,
and (2) a “menu” approach with participants provided with a range of possible
qualitative questions and forms that are locally relevant and presented as a
menu of options from which to choose. In the menu approach, the questions
can be simultaneously pre-assigned to their proper categories (recording,
reporting, demographics, use of data, availability of forms, etc.) and structured
by subgroup so participants can effectively choose how to prioritize questions
and design their national questionnaire. For the accuracy component, the sources
of data and levels of analysis could also be presented as a series of options from
which the participants choose, with the possibility of revising or adapting
them after field work. The first approach provides better ownership of the
process but is more time consuming and necessitates more intensive guidance
throughout the workshop. The second approach necessitates careful planning
and excellent understanding of the local situation prior to the workshop.
• If participants are able to use Excel, computers provide an opportunity to
learn how to create small databases, analyse data and create ways of displaying
data. This saves time when transferring data for presentations, and Microsoft
PowerPoint presentations can also enhance the efficiency of the workshop.
• Preparation: All facilitators should arrive in country early to permit three full
days of preparatory work before the workshop. This would permit two field
days for travel to several health centres in several districts to be able to give a
realistic overview of data flow from community up to national level, followed
by one day for revising the menu of qualitative and accuracy questions.
Promoting local facilitators is critical in encouraging ownership and
sustainability of the process. Facilitators should receive adequate briefing one
month in advance and should be allocated tasks so they can begin their individual
preparations. Where possible, presentations and session plans from prior
workshops should be shared widely.
• Because the field visits require good coordination, it is important that focal
point(s) for in-classroom logistics are appointed. The facilitators should know
who the focal points are and should have good channels of communication
with these individuals. At least a week before the workshop, the organizers
should prepare a detailed list of supplies needed; this should include an ample
quantity of office items such as markers, flipcharts, scissors and staplers, as well
as access to other necessary equipment (computers, printer, photocopier). Field
travel should be carefully coordinated with the focal point to ensure adequate
cars for transport. The lead facilitator should be aware of the budget allocation in order
to address any possible budget constraints. The workshop can be integrated
with other health-sector monitoring issues as usually the same staff members
are busy with a variety of health data. Hence this would be a good opportunity
to explore whether DQS principles can be used for other health indicators.
After data collection on site, each team presents its findings and recommendations,
emphasizes the most important or urgent points, suggests persons/parties who should
be responsible for follow-up action, and draws up a timetable of corresponding
activities.
In presenting its findings, the team should review the terms of reference, explain the
methodology used, summarize observations (supplemented by supportive objective
information), provide recommendations and acknowledge the contribution of
everyone who has helped to make the review a success. Any visual aids used during
the presentation should be shared for use during future meetings/training sessions.
Involvement of local partners and academic institutions. In order to build a country's
capacity to perform data quality self-assessments and sustain implementation of the tool,
and hence maintain a high standard of monitoring practices, it is important to involve local
partners and academic institutions in a DQS workshop and its follow-up. This could also be
a gateway for an eventual extension of the tool to other health indicators.
The success of the DQS depends on the success of the workshop and the first assessment.
A well-focused assessment should result in the following.
• A documented monitoring system: The tool should help managers to estimate
whether the information collected is reliable (accurate), and whether the
information is properly used (the monitoring system is of good quality).
• Identified weaknesses and strengths of the system: Major problems should
be localized.
• Recommendations for improvement of the performance of the system:
Monitoring immunization services is meaningful only if the information that is
produced can be used and leads to action. In particular, DQS results should be
used to:
− adjust district microplans accordingly;
− review the effectiveness of applied strategies;
− change priorities in the plan;
− guide future supervisory visits to focus on the issues found in the DQS.
Finally, recommendations should include ways to use and promote the DQS options
that the workshop participants developed. The goal of the DQS is to integrate the
tool into routine practice (sustainable self-assessment) which should be facilitated
by the fact that the tool is self-designed.
• Integrate the DQS concept into the national training schedule.
• Form a core team or designate a focal point to be responsible for follow-up of
DQS findings, help incorporate these into supportive supervisory visits, and
involve local partners and academic institutions.
• Include key quantitative DQS measures as core indicators at the district level.
• Integrate DQS measures and tools into Reach Every District (RED) workshops
and microplanning activities.
• Develop district-level DQS guidelines or a workbook (or integrate into existing
district material).
Sample chart for monitoring doses administered and drop-outs in children less than one year of age

[Chart: cumulative DTP1 and DTP3 doses plotted month by month on a scale of 0 to 150, with the data table below; the Sep to Dec columns are still blank in the example.]

Month        Jan  Feb  Mar  Apr  May  Jun  Jul  Aug
DTP1          10   12    7   12   14   15   14    7
DTP1 total    10   22   29   41   55   70   84   91
DTP3           8    9    8   10   11   12   12    7
DTP3 total     8   17   25   35   46   58   70   77
DO#            2    5    4    6    9   12   14   14
DO%           20   23   14   15   16   17   17   15

DO = drop-out
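The DO# and DO% rows can be reproduced from the monthly DTP1 and DTP3 counts. A minimal sketch, using the drop-out definitions implied by the table (DO# = cumulative DTP1 minus cumulative DTP3; DO% = DO# as a share of cumulative DTP1):

```python
# Monthly DTP1 and DTP3 doses from the sample chart (Jan to Aug)
dtp1 = [10, 12, 7, 12, 14, 15, 14, 7]
dtp3 = [8, 9, 8, 10, 11, 12, 12, 7]

def dropout(d1, d3):
    """Per month: cumulative drop-out number (DO#) and rate in % (DO%)."""
    c1 = c3 = 0
    rows = []
    for m1, m3 in zip(d1, d3):
        c1, c3 = c1 + m1, c3 + m3
        rows.append((c1 - c3, round(100 * (c1 - c3) / c1)))
    return rows

for do_n, do_pct in dropout(dtp1, dtp3):
    print(f"DO# {do_n:2d}  DO% {do_pct}%")
```

Running this reproduces the DO# and DO% rows of the chart exactly.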
Example of a completeness/timeliness reporting table
Insert the date the HU reports were received at the district office. If a report is received after the deadline, enter the date in red.
Display the table in the district office.
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec   Total completeness a   Total timeliness a
HU1
HU2
HU3
HU4
HU5
HU6
HU7
HU8
HU9
HU10
Total received this month (No.)
Total received this month (%)
Cumulative completeness b (%)
Total on time this month (No.)
Total on time this month (%)
Cumulative timeliness c (%)
Key
a Total completeness or timeliness: refers to the reporting completeness of the selected HU. Can be filled in at the end of the year, or updated each month, giving the
HU completeness and timeliness at each moment of the year. This may be easier in a computerized worksheet.
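The monthly totals at the bottom of the table can be computed as sketched below. The received dates and the deadline (the 5th of the following month) are hypothetical assumptions; the table itself records dates rather than computing them:

```python
from datetime import date

# Hypothetical received dates for one month's reports from five HUs (None = not received)
received = {
    "HU1": date(2005, 2, 3), "HU2": date(2005, 2, 9), "HU3": None,
    "HU4": date(2005, 2, 4), "HU5": date(2005, 2, 5),
}

def monthly_stats(received, deadline):
    """Percentage of expected reports received, and percentage received on time."""
    expected = len(received)
    got = [d for d in received.values() if d is not None]
    on_time = [d for d in got if d <= deadline]
    return 100 * len(got) / expected, 100 * len(on_time) / expected

completeness, timeliness = monthly_stats(received, date(2005, 2, 5))
print(f"completeness {completeness:.0f}%, timeliness {timeliness:.0f}%")
```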
Standard questions to assess the quality of the monitoring system (each question scored from 1 to 3)
Evidence of using data for action (district)

43. Is there an analysis of HU data performed regularly with HU staff?
    Analysis can be done within supervisory visits, meetings at district level, etc. Explore the quality of analysis as well as the exhaustiveness of the HUs said to be analysed: none of them should be left out.
44. Do you send regular monthly written feedback to the HUs?
45. Are areas of low access identified, and is there evidence of action taken to deal with them?
    Discuss the importance and reasons for low access. How do the three strategies (fixed site, outreach and mobile teams) relate to the issue of access in the district?
46. Have reasons for any high drop-out been identified, and are there plans/actions to deal with it?
    Are there managerial practices that could be changed to reduce the drop-out rate?
47. Is there monitoring of HU vaccine stock-outs? (A stock-out is an interruption in vaccine supply [for any vaccine].)
    The manager should be able to say (based on written information) whether any HU has encountered a vaccine stock-out. If no vaccine stock-out is reported, ensure that the monitoring is possible and is being implemented. Staff should be monitoring the level of reserve stocks and taking action if stock goes below a specified reserve level.
48. Are there problems with completeness and timeliness of reports?
    Are the late or incomplete reports usually from the same HUs? What was done to follow them up? What other actions were taken to encourage/induce timely reporting?
49. Are the recommendations made for the last three supervisory visits followed up in subsequent visits?
50. Has the monitoring of the selected immunization safety indicator been adequate during the last 12 months?
51. Are surveillance and coverage data compared to look for inconsistencies and then followed up to understand why?
Standard questions to assess the quality of the monitoring system
Were all vaccinations well registered on the child health card/tally sheet/register?
6. Are individual immunization records used, updated and given to the child’s caretaker at the time of the immunization visit?
    Blank cards should be available in the HU. Immunization cards are often integrated in “Road to Health” or other health cards.
7. Are vaccine receipts recorded in a vaccine ledger?
    Check against available stock (count doses in the refrigerator).
8. Ask the child’s caretaker: Do you know the expected date of receiving vaccine?
    Find out whether the expected dates are known.
9. Is the ledger up to date for all vaccines and/or a selected vaccine?
    Up to date = all receipts and issues recorded immediately. Check against stock (in the refrigerator). Compare the date of last entry and the …
10. Is the receipt of a selected vaccine in the ledger complete for the entire year?
11. Is there a log (vaccine ledger/stock card) for receipt/issuing of syringes supplied (AD/non-AD reconstitution syringes)?
    Can perform a stock check.
12. Does the HU record vaccine batch-number and expiry date?
13. Are all individual recording forms available for the entire previous year?
    Individual recording form = tally sheet or register.
14. Did every person doing the child immunization card exercise get a perfect score for DTP1, DTP3 and measles?
    Need to define how the scoring will be done if a perfect score is not obtained.
15. Is the cold chain temperature monitoring chart completed daily?
    Check the chart and compare the latest reported temperature with the actual temperature in the refrigerator.
Standard questions to assess the quality of the monitoring system (cont’d...)
Archiving component (HU)

21. Can copies of all previous reports from this HU be found in the HU?
    For current and previous year.
22. Is there one location where the previous immunization reports and recording forms are stored?
23. Are the reports of the HU organized in a file by date?
    The main concern is that the reports are easily retrievable.
24. Are HU reports available for the entire year?
25. Are the child registers available for all periods of the previous year?
26. Can all tally sheets covering the previous year be found?
27. Are registers for TT vaccinations to pregnant women available for the entire previous year?
28. Is the latest feedback on data from the district easily available?
Standard questions to assess the quality of the monitoring system (cont’d...)
Core outputs/analysis (HU)

33. Does the HU have a (target) number of children that it strives to vaccinate during a calendar year or a reporting period?
34. Is there a mechanism in place to track defaulters?
    Can be an appropriate use of a correctly filled register, tickler file, etc. When was the last time a child was followed up?
35. Does the HU have achievements split by type of strategy (fixed/outreach/mobile)?
    It is important to know the proportion of numbers actually reached by each strategy.
36. Does the HU have an up-to-date chart or table (preferably on display) showing the number of vaccinations by report period for the current year?
    Monitoring coverage chart: must be UP TO DATE.
37. Is there a monthly chart/graph of VPD cases (broken down by VPD)?
    How do these data correspond to coverage data (i.e. more cases in areas with poor coverage)? When was the last VPD outbreak? Was it investigated? Why did it occur?
38. Does the HU monitor drop-out rate?
    Preferably on display with the same monitoring chart as the coverage one, but score 1 if the health worker can tell you the drop-out rate for his HU. Discuss the importance and reasons for drop-outs.
39. Does the HU monitor vaccine wastage?
    Discuss the reasons for wastage and any ways it might be reduced. Discuss whether the health worker knows how much the vaccine wastage is and how it can be calculated.
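For question 39, a minimal sketch of the usual wastage calculation. The dose counts are hypothetical, and this is a common EPI formulation rather than a formula prescribed by the DQS itself:

```python
def wastage_rate(doses_issued, doses_administered):
    """Vaccine wastage rate (%): the share of issued doses that was not
    administered. A common EPI formulation; local tools may differ."""
    return 100 * (doses_issued - doses_administered) / doses_issued

# Hypothetical: 500 doses issued during the month, 380 actually administered
print(f"wastage rate: {wastage_rate(500, 380):.0f}%")
```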
Standard questions to assess the quality of the monitoring system (cont’d...)
Child History
Child 1 due for DTP3, OPV3
BCG, DTP1, OPV1, DTP2, OPV2 given
Child old enough for OPV3/DTP3 vaccination
Child 2 due for BCG
Child born two days ago
Child 3 due for DTP2, OPV2
BCG, DTP1, OPV1 given
Child old enough for OPV2/DTP2 vaccination
Child 4 not due for any vaccination
BCG, DTP1, OPV1 given on schedule,
OPV2/DTP2 given only two weeks ago
Child old enough for OPV3/DTP3
Child 5 due for BCG, DTP1, OPV1
No vaccinations given
Child old enough for OPV1/DTP1 vaccination
Child 6 due for measles
BCG, DTP1, OPV1, DTP2, OPV2, DTP3, OPV3 given
Child old enough for measles vaccination
Child 7 due for BCG, DTP1, OPV1
No vaccinations given
Child old enough for OPV1/DTP1 vaccination
Child 8 due for DTP1, OPV1
BCG given at birth
Child old enough for OPV1/DTP1 vaccination
Child 9 due for measles
BCG, DTP1, OPV1 given on schedule, DTP2/OPV2 given just
two weeks ago
Child old enough for measles vaccination
Child 10 due for DTP3, OPV3
BCG, DTP1, OPV1, DTP2, OPV2 given
Child old enough for OPV3/DTP3 vaccination
Child 11 due for BCG
No vaccinations
Child born 2 weeks ago
Child 12 due for BCG, DTP1, OPV1
No vaccinations
Child old enough for OPV2/DTP2 vaccination
Child 13 due for DTP1, OPV1
BCG at birth
Child old enough for OPV1/DTP1 vaccination
Child 14 due for DTP3, OPV3
BCG, DTP1, OPV1, DTP2, OPV2 given
Child old enough for OPV3/DTP3 vaccination
Child 15 due for DTP1, OPV1
BCG given late
Child old enough for OPV1/DTP1 vaccination
Child 16 due for DTP3, OPV3, measles
BCG late, DTP1, OPV1 late, DTP2, OPV2 late
Child old enough for measles vaccination
Child 17 due for measles
BCG, DTP1, OPV1, DTP2, OPV2, DTP3, OPV3 given
Child old enough for measles vaccination
Child 18 due for DTP3, OPV3, measles
BCG, DTP1, OPV1, DTP2, OPV2 given
Child old enough for measles vaccination
Child 19 due for DTP2, OPV2
BCG, DTP1, OPV1 given late
Child old enough for OPV3/DTP3 vaccination
Child 20 due for DTP2, OPV2
BCG, DTP1, OPV1 given
Child old enough for OPV2/DTP2 vaccination
1) Obtain the list of all HUs providing immunization services. This list is then
the sampling frame from which the sample is to be selected.
2) A sampling interval is then determined. The sampling interval is a number
used to systematically select HUs from the sampling frame. To determine it,
divide the total cumulative number of vaccinations across all HUs
(in this example DTP3) by the number of HUs you want to sample
(say six in the following example).
In practice, make a table listing all the HUs in the district, and make a cumulative
total of their DTP3 vaccinations.
List of all health units with their respective DTP3 vaccination numbers,
and cumulative DTP3 totals
If there were a total of 592 486 doses of DTP3 given among the 12 HUs available for
sampling, the sampling interval would be: 592 486 / 6 = 98 748 (a five-digit
number; six is the number of HUs to be sampled).
To select the first HU, first choose a random number between one and the
sampling interval.
Step 1: Choose a direction (right, left, up or down) in which you will read the
numbers from the table.
Step 2: Select a starting point: close your eyes, and touch the random number table
with a pointed object. Open your eyes. The digit closest to the point where
you touched the table is the starting point. Check that the starting point
will give a number which is going to be less than or equal to the sampling
interval. If not, start again before going on to Step 3.
Step 3: Read the number of digits required (determined by the sampling interval)
in the direction chosen in step 1. Because each individual digit in the table is
random, the sequence(s) of digits can be used across spaces between the
five-digit numbers. The number you end up with is your random number.
For example, let us say you decided to read numbers to the right, and you
identified your starting point as the number 3 in row 01, column 8 (see the
table of random numbers in this Annex). If the sampling interval had four
digits, then your random number would be “3861”. The numbers “6” and
“1” come from row 01, column 9.
NOTE: Remember that the random number selected must be equal to or
smaller than the sampling interval. If it is not, then another random number
must be selected. You can decide (before selecting your starting point) a
direction to go to choose it (right, left, up or down from the first selected
digit).
In our example, column 3, row 07 of the random number table gave the
number 92780. The first selected HU will be Dundee, as it is the first HU
where the cumulative population listed for that HU will equal or exceed
the random number.
Step 4: Identify the second HU by adding the sampling interval to the random
number. The cumulative population listed for that HU will equal or exceed
the number you calculate. Repeat for subsequent HUs. In our example,
these will be:
92 780 + 98 748 = 191 528 Nyeri selected
191 528 + 98 748 = 290 276 Pokot selected
290 276 + 98 748 = 389 024 Rossem selected
389 024 + 98 748 = 487 772 Unison selected
487 772 + 98 748 = 586 520 Erpent selected
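The selection steps above can be sketched in code. The HU names and dose totals below are hypothetical, and the random start would normally come from the random number table rather than being supplied directly:

```python
def pps_systematic(units, k, random_start):
    """Systematic sampling with probability proportional to size.
    units: list of (name, annual_doses) pairs in listing order;
    random_start: a number between 1 and the sampling interval,
    normally drawn from a random number table as described above."""
    total = sum(doses for _, doses in units)
    interval = total / k
    targets = [random_start + i * interval for i in range(k)]
    chosen, cumulative, t = [], 0, 0
    for name, doses in units:
        cumulative += doses
        # select the HU whose cumulative total first equals or exceeds each target
        while t < k and cumulative >= targets[t]:
            chosen.append(name)
            t += 1
    return chosen

# Hypothetical district: 4 HUs, total 1000 doses, sample 2, random start 250
units = [("Arua", 100), ("Bondo", 300), ("Chuka", 200), ("Dodoma", 400)]
print(pps_systematic(units, 2, 250))
```

Note that a very large HU can legitimately be selected more than once under this scheme; the inner loop handles that case.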
Column
Row 0 1 2 3 4 5 6 7 8 9
01 88008 13730 06504 37113 62248 04709 17481 77450 46438 61538
02 01309 13263 70850 11487 68136 06265 36402 06164 35106 77350
03 45896 59490 98462 11032 78613 78744 13478 72648 98769 28262
04 50107 24914 99266 23640 76977 31340 43878 23128 03536 01590
05 71163 52034 03287 86680 68794 94323 95879 75529 27370 68228
06 76445 87636 23392 01883 27880 09235 55886 37532 46542 01416
07 84130 99937 86667 92780 69283 73995 00941 65606 28855 86125
08 00642 10003 08917 74937 57338 62498 08681 28890 60738 81521
09 64478 94624 82914 00608 43587 95212 92406 63366 06609 77263
10 02379 83441 90151 14081 28858 68580 66009 17687 49511 37211
11 32525 44670 57715 38888 28199 80522 06532 48322 57247 46333
12 01976 16524 32784 48037 78933 50031 64123 83437 09474 73179
13 67952 41501 45383 78897 86627 07376 07061 40959 84155 88644
14 38473 83533 39754 90640 98083 39201 94259 87599 50787 75352
15 91079 93691 11606 49357 55363 98324 30250 20794 83946 08887
16 72830 10186 08121 28055 95788 03739 65182 68713 63290 57801
17 40947 75518 59323 64104 24926 85715 67332 49282 66781 92989
18 44088 70765 40826 74118 62567 75996 68126 88239 57143 06455
19 19154 29851 16968 66744 77786 82301 99585 23995 15725 64404
20 13206 90988 34929 14992 07902 23622 11858 84718 22186 35386
21 24102 13822 56106 13672 31473 75329 45731 47361 47713 99678
22 59863 62284 24742 21956 95299 24066 60121 78636 61805 39904
23 57389 70298 05173 48492 68455 77552 87048 16953 45811 22267
24 63741 76077 44579 66289 88263 54780 76661 90479 79388 15317
25 17417 56413 35733 27600 06266 76218 42258 35198 26953 08714
26 85797 58089 91501 34154 96277 83412 70244 58791 64774 75699
27 65145 97885 44847 37158 54385 38978 20127 40639 80977 73093
28 24436 65453 37073 81946 36871 97212 59592 85998 34897 97593
29 20891 03289 98203 05888 49306 88383 56912 12792 04498 20095
30 81253 41034 09730 53271 92515 08932 25983 69674 72824 04456
31 64337 64052 30113 05069 54535 01881 16357 72140 00903 45029
32 35929 76261 43784 19406 26714 96021 33162 30303 81940 91598
33 34525 54453 43516 48537 60593 11822 89695 80143 80351 33822
34 27506 45413 42176 94190 29987 90828 72361 29342 72406 44942
35 92413 00212 35474 22456 76958 85857 85692 75341 32682 00546
36 76304 57063 70591 06343 38828 15904 79837 46307 40836 69182
37 17680 92757 40299 98105 67139 01436 68094 78222 61283 40512
38 43281 36931 26091 42028 62718 38898 64356 19740 77068 78392
39 30647 40659 23679 04204 67628 81109 73155 68299 62768 58409
40 26840 42152 80242 57640 19189 47061 44640 52069 98038 49113
Annex F:
Data quality self-assessment
workshop schedule
Location: _______________________________________________
Dates: __________________________________________________
Day 1
Day 3:
Day 4:
AM: Pretest
Session 9: Finalization of the tool
14.30 Feedback
15.30 Revisions of the QQs and the forms
16.30 Assessment in practice
Days 5, 6, 7:
Day 8:
Day 9: