
STRIVE
Strengthening Implementation of Visayas Education

DepED

SELF ASSESSMENT CONSOLIDATION DATABASE MANUAL
VERSION: 2
DATE: JULY 2010
CONTENTS
1 INTRODUCTION....................................................................................................................3

2 PREPARING SPREADSHEETS FOR THE DATABASE.......................................................3


2.1 Problematic Files.......................................................................................................3
2.2 Previous versions of the self assessment spreadsheet.............................................3
3 THE DATABASE....................................................................................................................3
3.1 The Main Menu..........................................................................................................3
3.2 Reading in the Data from the Self-Assessment Spreadsheets..................................4
3.2.1 Importing the Data from the Excel Spreadsheets.....................................4
3.2.2 Reading in Additional Data to the Database.............................................5
3.2.3 Errors in the Data......................................................................................5
3.3 Report Features.........................................................................................................6
3.4 Specifying the Data for the Report............................................................................6
3.4.1 Specifying Combinations of Data Fields...................................................7
3.4.2 Wildcard fields...........................................................................................7
3.4.3 School Level Specification........................................................................7
3.4.4 Grade Specification...................................................................................8
3.4.5 Subject Specification.................................................................................8
3.5 Reports and Charts Based on the Specified Data.....................................................9
3.5.1 “Scores Per Respondent” Report..............................................................9
3.5.2 “Average Score Per Domain” Chart..........................................................9
3.5.3 “Average Score Per Strand” Chart............................................................10
3.5.4 “Scores by Domains and Strands” Report...............................................10
3.5.5 “Scores by Division / District” Reports.......................................................11
3.6 The Archive and Trend Report Feature.....................................................................11
3.6.1 The archive feature...................................................................................12
3.6.2 Trend Reports based on Archived Data....................................................12
3.6.3 Chart Showing Trend in Archived Domain Scores....................................13
APPENDIX A. THE SCORES BY RESPONDENT REPORT...................................................14

APPENDIX B. REPORT OF SCORES BY DOMAINS AND STRANDS..................................17

APPENDIX C. SCORES BY DIVISION / DISTRICT REPORTS..............................................18


C.1 Scores by Division / District......................................................................................18
C.2 Scores for Each School per Division / District..........................................................19
APPENDIX D. REPORT BASED ON ARCHIVED DATA (TREND REPORT).........................20

1 Introduction
This manual describes a computer-based self assessment consolidation database. The database
consolidates and analyses the data from the Excel spreadsheets with which individuals complete a self
assessment (for example, the TSNA self assessment spreadsheets).
The database is a Microsoft Access database which reads the scores from a collection of individuals'
spreadsheets and provides a number of facilities to analyse these scores (individually and collectively) in the
form of tables, charts, and printed reports. The database also provides filters for selecting the data used in the
reports, so that, for example, one could consider the data from male respondents only, from female
respondents only, or even look at the data for a particular individual.
The spreadsheet and database were originally developed for the self assessment of teachers in the
Philippines, by the STRIVE project, in collaboration with Regions VI, VII and VIII, the BESRA T&D TWG,
and other stakeholders.
The database is a single program file (“TSNA Consolidation Database V2.mdb”). Any data that has been
read into the database is also stored within that file.
In order for the database to read in the data from any collection of self assessment spreadsheets, they must
be organised as described in the next section.

2 Preparing Spreadsheets for the Database


In order for the data from a collection of spreadsheets representing individual self assessments to be read
into the database, the spreadsheets must be placed in a folder called SA Forms that resides in the same
folder as the database.
The spreadsheets must be “.xls” files, not “.xlsx”.
When consolidating the data (as described in section 3.2) the database will attempt to read in data from
every “.xls” file in the SA Forms folder.
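As an informal illustration of the folder layout described above, the short Python sketch below lists which files in an SA Forms folder would be read and which would be ignored. The filenames are the ones used in this manual; the sketch is not part of the database itself.

    from pathlib import Path

    # The "SA Forms" folder must sit in the same folder as the database file.
    database_path = Path("TSNA Consolidation Database V2.mdb")
    forms_folder = database_path.parent / "SA Forms"

    if not forms_folder.is_dir():
        raise SystemExit("Create an 'SA Forms' folder next to the database first.")

    # Only ".xls" workbooks are read during consolidation; everything else is ignored.
    xls_files = sorted(forms_folder.glob("*.xls"))
    ignored = [p.name for p in forms_folder.iterdir()
               if p.is_file() and p.suffix.lower() != ".xls"]

    print(f"{len(xls_files)} spreadsheet(s) to be read, {len(ignored)} other file(s) ignored.")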

2.1 Problematic Files


Any files in the SA Forms folder that are not “.xls” files will simply be ignored by the database when it reads
in the spreadsheets. Excel spreadsheets that are not self assessment spreadsheets will lead to an error
condition. If this occurs, the offending files should be removed from the SA Forms folder and the data
should be consolidated into the database again, by repeating the instructions in section 3.2.

2.2 Previous versions of the self assessment spreadsheet


In addition to the most recent version of the self assessment spreadsheet (issued in May 2010), the
database will process an earlier version that was issued in 2009. Previous and current versions of the
spreadsheet should be placed together in the SA Forms folder. When consolidating the data (see section
3.2) the database can automatically determine the spreadsheet version, and will read in the data from each
spreadsheet accordingly.
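The manual does not describe how the database tells the two spreadsheet versions apart, so the following Python sketch is purely illustrative: it guesses the version from a hypothetical worksheet name, using the xlrd library to read the legacy ".xls" format. The worksheet names are assumptions, not the actual layout of the TSNA forms.

    import xlrd  # third-party library for reading legacy ".xls" workbooks

    def detect_version(path):
        # Hypothetical rule: decide the version from the worksheet names.
        book = xlrd.open_workbook(path)
        names = set(book.sheet_names())
        if "Self Assessment 2010" in names:   # assumed marker for the May 2010 form
            return "2010"
        if "Self Assessment 2009" in names:   # assumed marker for the 2009 form
            return "2009"
        raise ValueError(f"{path} does not look like a self assessment spreadsheet")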

3 The Database
The current version of the Microsoft Access database is in Access 2003 format, so it can be used in Access
2003 or 2007. The preparation of the spreadsheets that are to be read into the database was described in
section 2.
This section describes the features available in the database. In the examples given here, the data from 8
(fictitious) respondents has been read into the database.
The current filename of the database is TSNA Consolidation Database V2.mdb. The database opens to a
simple main menu, described next.

3.1 The Main Menu


The main menu is shown in Figure 1.

FIGURE 1. THE MAIN MENU

The option Consolidate Self-Assessment Data is described in section 3.2. The Reports option is
described in section 3.3. The Archive / Trend Reports feature is described in section 3.6. The Display List
of Teachers and SA Spreadsheets option is described in section 3.7.

3.2 Reading in the Data from the Self-Assessment Spreadsheets


Once a collection of self assessment spreadsheets is set up properly, as described in section 2, the option
Consolidate Self-Assessment Data on the main menu (see section 3.1.) is used to import the data from
those spreadsheets into the database.

3.2.1 Importing the Data from the Excel Spreadsheets


Clicking the button Consolidate Self-Assessment Data leads to the pop up window shown in Figure 2.

FIGURE 2. CONSOLIDATE THE SELF ASSESSMENT DATA

When the Import Excel Data button is clicked, the relevant data is extracted from the spreadsheets, and a
message appears as shown in Figure 3.

FIGURE 3. MESSAGE AFTER SUCCESSFUL DATA CONSOLIDATION

As can be seen in Figure 3, the message indicates how many spreadsheets were processed (in this case 8).
Once the data is in the database, it remains there, even if the database is closed (no “Save” action required).
All data will remain in the database until another collection of spreadsheets is imported.

3.2.2 Reading in Additional Data to the Database


Note that importing data from a collection of self assessment spreadsheets overwrites any data that is in the
database (except for the Archived data - see section 3.6). However, if you wish to add new data to the
existing data in the database, simply include the additional spreadsheets and the original spreadsheets in the
SA Forms folder (section 2) before carrying out the consolidation again.

3.2.3 Errors in the Data


When the database imports the data from a collection of spreadsheets in the SA Forms folder (see section
2) it checks whether the number of teacher records read in is the same as the number of spreadsheets. If
the number of teacher records is less than the number of spreadsheets, this indicates that the same
Teacher ID has been used in the teacher profile section of two or more spreadsheets. When this condition
occurs, the message box shown in Figure 4 is displayed.

FIGURE 4. ERROR MESSAGE WHEN TEACHER ID DUPLICATES ARE FOUND

On clicking the OK button in Figure 4, the user is presented with a screen such as that shown in Figure 5.

FIGURE 5. SCREEN TO VIEW ERRORS CAUSED BY DUPLICATE TEACHER ID

The list of teachers in Figure 5 is organised in ascending order of Teacher ID (the Teacher column, i.e. the
leftmost column of the list). The user can scan this column to determine where duplicate IDs have occurred.
In Figure 5, it can be seen that the code 90 is associated with teachers from two different spreadsheets.
Note that the data in the database is highly unreliable until all errors that can be detected from the screen in
Figure 5 have been corrected.
As explained in the text in the upper part of the screen shown in Figure 5, the user can click on a
spreadsheet filename (rightmost column of the list) to open the relevant spreadsheet in Microsoft Excel. The
spreadsheet can then be edited to rectify the problem with the teacher IDs. Once all duplicate IDs have been
removed the data can once again be consolidated (as described in section 3.2.1).
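The check described above can be pictured as a simple duplicate count. The Python sketch below, with invented Teacher IDs and filenames, flags the spreadsheets that share an ID so they can be edited and re-consolidated; it only illustrates the rule, not the database's internal code.

    from collections import Counter

    # One (teacher_id, filename) pair per imported spreadsheet (example data only).
    records = [(90, "santos_a.xls"), (90, "reyes_b.xls"), (17, "cruz_c.xls")]

    # Fewer distinct IDs than spreadsheets means at least one ID has been reused.
    id_counts = Counter(teacher_id for teacher_id, _ in records)
    duplicates = {tid for tid, count in id_counts.items() if count > 1}

    for teacher_id, filename in sorted(records):
        if teacher_id in duplicates:
            print(f"Duplicate Teacher ID {teacher_id} in {filename}: edit and re-consolidate")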

3.3 Report Features


The report features are accessed from the Reports button on the main menu (section 3.1.). Clicking the
button displays the dialogue box shown in Figure 6.

FIGURE 6. THE REPORT DIALOGUE BOX

The main area of the screen, above the row of large buttons, is where the user specifies the subset of the
data to be the basis of the report. This, along with the Reset Values button (lower left of Figure 6), is
described in section 3.4; the functions of the five large buttons are then described in section 3.5.

3.4 Specifying the Data for the Report


The fields in the upper part of the report dialogue box (Figure 6) can be used to specify subsets of the data
to be used in the reports. If the user makes no changes to these fields, the reports generated by the five
large buttons to the right of the Reset Values button will be based on all of the data in the database.

3.4.1 Specifying Combinations of Data Fields
Combinations of fields can be used, so, for example, if the chosen gender is Female (see Figure 7) and the
chosen experience is 16-25 Years (Figure 8), the reports generated will feature only female respondents
who have between 16 and 25 years of experience. (The experience band is calculated from the number of
years between the respondent's month and year of starting, as entered on the profile worksheet of the self
assessment spreadsheet, and the current date.)
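For illustration, the experience banding described above can be sketched in Python as below. The 16-25 Years band is the only band named in this manual; the other boundaries in the sketch are assumptions.

    from datetime import date

    def experience_band(start_year, start_month, today=None):
        # Whole years of service between the starting month/year and the current date.
        today = today or date.today()
        years = (today.year - start_year) - (1 if today.month < start_month else 0)
        # Band boundaries below are illustrative; only "16-25 Years" appears in the manual.
        if years <= 5:
            return "0-5 Years"
        if years <= 15:
            return "6-15 Years"
        if years <= 25:
            return "16-25 Years"
        return "26+ Years"

    print(experience_band(1990, 6, today=date(2010, 7, 1)))  # prints "16-25 Years"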

FIGURE 7. CHOOSING THE GENDER


FIGURE 8. CHOOSING RANGE OF EXPERIENCE

Note from Figure 7 and Figure 8 that some fields have drop down lists of acceptable values. Some of these
lists are generated from the data in the self assessment spreadsheets imported into the database. For
example, Figure 9 shows a drop down list of the 8 names that were imported as examples for this manual.

FIGURE 9. AN EXAMPLE LIST OF RESPONDENTS

Note that names are listed in the form Family Name, First Name, Middle Initial. Names are automatically
listed in ascending alphabetical order of family name.
Thus, it is possible to generate a report on a single individual, if desired.
Pressing the Reset Values button (see Figure 6) sets all the fields back to the values shown in Figure 6, so
any reports then generated will use all of the data.

3.4.2 Wildcard fields


It can be seen that many of the fields in Figure 6 contain only an asterisk (*). An asterisk is known as a
wildcard and can stand for any sequence (possibly empty) of characters, numbers and spaces. For example,
if the user enters M* into the name field, it would stand for “all last names beginning with M”. For the
example list of names in Figure 9, this would specify the three records containing last names beginning with
“M”. More than one asterisk can be used in such a field.
It is important to realise that if a text field is empty no data will be retrieved at all. For example, if the Gender
box is empty, no records will be retrieved because there is no record in which the Gender field is empty. If
the Gender field contains an asterisk, records for male and female respondents will be retrieved. However,
since clicking the Reset Values button (Figure 6) will set all the fields back to their default values, this need
not concern the user too much.
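The wildcard behaviour can be reproduced with Python's fnmatch module, as in the sketch below (the surnames are invented, standing in for the example list in Figure 9):

    from fnmatch import fnmatchcase

    surnames = ["Mabini", "Martinez", "Molina", "Reyes", "Santos"]  # invented examples

    # "M*" keeps only surnames beginning with M; "*" keeps everything;
    # an empty pattern matches nothing, just as an empty field retrieves no records.
    print([s for s in surnames if fnmatchcase(s, "M*")])  # ['Mabini', 'Martinez', 'Molina']
    print([s for s in surnames if fnmatchcase(s, "*")])   # all five surnames
    print([s for s in surnames if fnmatchcase(s, "")])    # []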

3.4.3 School Level Specification


The check boxes Elementary and Secondary are used to specify the school level. By default, both boxes
are checked (as in Figure 6). In this case, records for respondents who teach at either level of school, or in
integrated schools, will be retrieved.
When only one of the boxes is checked, only schools of that level and integrated schools will be considered.
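A sketch of this rule in Python is shown below. The level names, and the behaviour when neither box is checked, are assumptions made for the illustration.

    def level_matches(school_level, elementary_checked, secondary_checked):
        # Integrated schools are included whenever at least one box is checked.
        if school_level == "Integrated":
            return elementary_checked or secondary_checked
        if school_level == "Elementary":
            return elementary_checked
        if school_level == "Secondary":
            return secondary_checked
        return False  # assumed: other values are not retrieved

    print(level_matches("Integrated", True, False))   # True
    print(level_matches("Secondary", True, False))    # False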

Note that the set of records can be made more specific by also choosing the grades that the respondents
teach, as described next.

3.4.4 Grade Specification


The Grade Level Taught option, Figure 6, leads to a drop down list of combinations of grades, a part of
which can be seen in Figure 10.

FIGURE 10. PART OF THE GRADES DROP DOWN LIST

A single grade or a list of grades can be selected, such as 2,3. Then the records retrieved for the report will
be all those in which the respondent teaches grades 2 and 3 (among the other grades they teach, if any). It
will not retrieve records for teachers who teach only one of the grades. For example, a respondent who
teaches grades 3 and 4 only would not be retrieved, but a teacher who teaches grades 1, 2 and 3 would be.
The drop down list of grade combinations is not exhaustive, but the user can enter a list directly in the field
without using the drop down list, as long as it is entered with no spaces and with commas separating the
values, e.g. 2,4.
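In other words, every selected grade must be among the grades a respondent teaches. A minimal Python sketch of this rule:

    def grades_match(selected, taught):
        # Every selected grade must appear among the grades the respondent teaches.
        return set(selected) <= set(taught)

    print(grades_match([2, 3], [1, 2, 3]))  # True: teaches grades 2 and 3, among others
    print(grades_match([2, 3], [3, 4]))     # False: teaches only one of the two grades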

3.4.5 Subject Specification


The Subjects Taught section of the report data specification dialogue box in Figure 6 is used to specify
combinations of subjects that the respondents teach. An example configuration can be seen in Figure 11.

FIGURE 11. A COMBINATION OF SUBJECTS SPECIFIED

In Figure 11, the user has checked the boxes for Math, Science and the Makabayan subject Home Ec. and
Livelihood / Technology and Home Ec. (these are the corresponding Makabayan subjects for elementary
and secondary schools, respectively).
The records retrieved would be those for teachers who teach the three subjects specified. Note that the
Exact Subject Match check box is ticked. This means that the only records retrieved would be for
respondents who teach exactly those subjects (and no others). Un-checking the box would retrieve records
for respondents who teach those three subjects and perhaps other subjects as well.
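The effect of the Exact Subject Match check box can be sketched in Python as a set comparison (the subject names below are shortened, illustrative labels):

    def subjects_match(selected, taught, exact_match):
        selected, taught = set(selected), set(taught)
        if exact_match:
            return taught == selected      # exactly those subjects and no others
        return selected <= taught          # those subjects, possibly among others

    taught = {"Math", "Science", "Home Ec. and Livelihood", "English"}
    selected = {"Math", "Science", "Home Ec. and Livelihood"}
    print(subjects_match(selected, taught, exact_match=True))   # False: English is extra
    print(subjects_match(selected, taught, exact_match=False))  # True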

3.5 Reports and Charts Based on the Specified Data
This section describes the reports that are based on the data specified using the fields in Figure 6.
As stated earlier, when the field values are as they appear in Figure 6 (which is the state of the fields when the
dialogue box is opened, or immediately after the Reset Values button is clicked), reports are based on the
data for all respondents. If one or more fields are set to other values, the reports generated will be based on
the specified subset of the data.

3.5.1 “Scores Per Respondent” Report


The Scores Per Respondent button (Figure 6) generates a report as shown in Appendix A (the one shown
includes all respondents in the example). The report gives a detailed breakdown of the scores, by domain
and strand, for each respondent, together with overall averages, also by domain and strand. As shown in
Appendix A, the user can also find out exactly which respondents fall within each band (expert, experienced,
developing, beginning) for each domain.
See Appendix A for further discussion of features available with this report.

3.5.2 “Average Score Per Domain” Chart


The Average Score Per Domain button (Figure 6) generates a chart as shown in Figure 12. In this case,
we have specified females only, as in Figure 7, so the chart shows only the average scores for female
respondents (see the report in Appendix A to confirm that there are 4 female respondents in the example
data). Note that the number of respondents featured in the chart is shown as 4 (lower right) and that, below
the chart, the only data field we have specified, Gender, is shown as set to Female. A similar section giving
the values of all the data specification fields appears in all reports and charts that depend on these values
(see, for example, the report in Appendix A).

FIGURE 12. SCORES BY DOMAIN CHART (FEMALE RESPONDENTS ONLY)

The Domains button, to the lower right hand side of the chart in Figure 12, lists the domains (and their
codes) for the data currently in the database. Thus, the user can relate each code on the x axis of the chart
to the corresponding domain. Appendix A shows how the domain list appears when the Domains button is
clicked.
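Conceptually, the chart is an average of the filtered scores grouped by domain. The Python sketch below illustrates this with invented domain codes and scores; in the database, the rows would first be restricted by the specification fields (for example, Gender set to Female).

    from collections import defaultdict

    # (respondent, domain_code, score) rows - invented values for the illustration.
    scores = [
        ("Reyes", "D1", 3.0), ("Reyes", "D1", 2.0), ("Reyes", "D2", 4.0),
        ("Santos", "D1", 2.0), ("Santos", "D2", 3.0),
    ]

    by_domain = defaultdict(list)
    for _respondent, domain, score in scores:
        by_domain[domain].append(score)

    for domain in sorted(by_domain):
        values = by_domain[domain]
        print(f"{domain}: average {sum(values) / len(values):.2f} over {len(values)} scores")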

3.5.3 “Average Score Per Strand” Chart
The button Average Score Per Strand (Figure 6) results in a chart showing average scores (for the
selected subset of the data) by strand. An example is shown in Figure 13.
The chart in Figure 13 is based on the whole of the example data set (8 respondents).
As with the scores by domain chart (section 3.5.2), the user can click the Strands button to see a list of
strands and their codes. Appendix A shows how the strand list appears when the Strands button is clicked.

FIGURE 13. AVERAGE SCORE BY STRAND (ALL DATA)

3.5.4 “Scores by Domains and Strands” Report


The next report that is affected by the specification of data on the dialogue box in Figure 6 is invoked by
clicking the Overall Scores for Domains and Strands button.
When the button is clicked, the small dialogue box in Figure 14 appears.

FIGURE 14. THE REPORT DIALOGUE BOX FOR DOMAIN & STRAND REPORT

In the dialogue box in Figure 14, the Preview/Print Report button shows a print preview of the report, which
can then be printed. An example of the report (once again for all of the female respondents) is shown in
Appendix B.
The Save Report as a File button enables the user to save the report in a variety of formats (rtf, which can
be opened in Microsoft Word; pdf; html; etc.). According to the chosen format, the file is immediately opened
in the user's default application for that type of file (this is likely to be Microsoft Word for rtf, Acrobat Reader
for pdf, the user's default browser for html, and so on). As is usual in Windows, when you save the file you
can select a location in which to save it, and also edit the default filename to one of your own choosing.

3.5.5 “Scores by Division / District” Reports


Two reports can be generated from this option (button at the lower right hand side of Figure 6): a report of
Scores for strands and domains by Division and District, and a report of Scores for strands and
domains for each school by Division and District.
As for the other reports described above, the dataset depends on the specification in the dialogue box shown
in Figure 6.
On clicking the button described immediately above, the dialogue box shown in Figure 15 appears.

FIGURE 15. DISTRICT REPORTS DIALOGUE BOX

As for the dialogue box discussed in the previous section (see Figure 14), the user has the option to preview
(for subsequent printing) the report and/or save it as a file. In this case two reports are available, as was
discussed above and can be seen from Figure 15.
An example of the report preview generated by clicking the button Preview or Print Division / District
Report (top left button in Figure 15) can be found in Appendix C (part C.1). This report is organised by
division, showing overall (average) scores per strand and domain for each district. The average score for
each division as a whole is also shown, as is the overall score for all divisions in the current dataset.
An example report preview generated by clicking the corresponding button for the report of scores for each
school (see Figure 15) can be seen in Appendix C (part C.2). The report follows the structure of the report
described immediately above, but in this case the overall scores for each school within each district are also
shown.

3.6 The Archive and Trend Report Feature


The archive and trend reports facilities are accessed by using the Archive / Trend Reports button on the
main menu (Figure 1). Clicking the button leads to the dialogue box in Figure 16.


FIGURE 16. ACCESS TO ARCHIVE AND TREND REPORT FEATURES

These facilities are described in this section. Section 3.6.1 describes the archiving feature. Sections 3.6.2
and 3.6.3 describe the trend report and chart available via the Trend Report and Chart... button.

3.6.1 The archive feature


The feature to manage the archiving of sets of data is accessed via the Archive all Data button in the
dialogue box shown in Figure 16. The dialogue box shown in Figure 17 then appears.

FIGURE 17. THE DIALOGUE BOX FOR MANAGING ARCHIVE DATA

To archive the current data, the user selects the month and year (using the drop down lists provided). Both
month and year must be specified, so it is possible (but not necessary) to archive different sets of data for
different months in the same year. The Archive Data button is then clicked and the data is saved in the
database.
If there are already 5 sets of archived data, the user is informed and no changes are made.
However, archived sets of data can be removed using the Remove Earliest Archive Set button (Figure 17).
As its name suggests, this button removes from the archive the earliest dated set of archived data, leaving
the remaining sets intact. This makes room for another set to be added, if required. If there is no data to
remove, a message appears to this effect.
Archiving a set of data does not change the current respondent data in the database. This will not change
until a further set of spreadsheets is imported. Moreover, as stated earlier, importing additional data from self
assessment spreadsheets does not change the archive.
The archive, like all other data, is stored inside the database, and thus remains intact when the database is
closed and reopened.
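The archiving rules described above (at most five dated sets, with the option to drop the earliest) can be pictured with the small Python sketch below; it models the behaviour only and is not the database's own code.

    from datetime import date

    MAX_ARCHIVE_SETS = 5

    def archive_data(archive, year, month, data):
        if len(archive) >= MAX_ARCHIVE_SETS:
            raise ValueError("There are already 5 archived sets; remove the earliest first.")
        archive[date(year, month, 1)] = data   # keyed by month and year

    def remove_earliest(archive):
        if not archive:
            raise ValueError("There is no archived data to remove.")
        del archive[min(archive)]              # drop the earliest dated set only

    archive = {}
    archive_data(archive, 2009, 7, "July 2009 consolidation")
    archive_data(archive, 2010, 1, "January 2010 consolidation")
    remove_earliest(archive)                   # the January 2010 set remains
    print(sorted(archive))                     # [datetime.date(2010, 1, 1)]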

3.6.2 Trend Reports based on Archived Data


This facility, invoked using the button Trend Report (based on Archived Data) shown in Figure 16,
generates a report based on all of the (up to 5) current sets of archived data. On clicking the button, the
dialogue box in Figure 18 appears.

FIGURE 18. TREND REPORT MENU

The Preview / Print Trend Report button in Figure 18 displays a print preview of a report, such as the
example report shown in Appendix D, in this case for an archive consisting of 3 sets of data. If there are
fewer sets, then fewer columns of data will be generated in the report. As stated before, up to 5 sets of data
can be archived.
This report enables the comparison of scores for domains and strands for a given set of respondents over a
period of time.
The Save Report as a File button saves the report in a chosen file format, in the same way as the button
described in section 3.5.4.

3.6.3 Chart Showing Trend in Archived Domain Scores


The button Trend Chart for Domain Scores (from Figure 18) displays a chart such as that shown in Figure
19.
The chart shows the average domain scores for archived data. For each domain, there is a column for each
archived set of data (here, there are three such sets, as in the trend report in Appendix D). For each domain,
the columns are in date order (earliest to latest).

FIGURE 19. TREND IN DOMAIN SCORES

APPENDIX A. The Scores by Respondent Report
(See section 3.5.1)

As can be seen above, the report is in two parts.
The upper part of the screen gives the scores (by strand and domain) for each respondent. The domain
scores are colour-coded, indicating which range (low, fair, high, very high) they fall within. The report is
sorted by respondent name (last name, then first name, then middle initial).
Note the instruction immediately above the list of respondents, to the left: Click on resp. code… . The user
can click on the code (ID) for a respondent to open the self assessment spreadsheet for that respondent in
Microsoft Excel. The database remains open with the same screen on display while the spreadsheet is
displayed.
The lower part of the screen gives a summary of average scores per domain and strand. The coloured
boxes beneath the domain and strand scores indicate the number of respondents in each score band. These
boxes can be clicked to see a list of the respondents falling within that band. The image below shows the
pop up window that appears when the second lowest of the three coloured boxes beneath domain 1 (the
one with the number 7 next to it) is clicked:

The two buttons towards the lower part of the screen (see full screen illustration, earlier), labelled Domains
and Strands, enable the user to display the codes and titles for the domains and strands, respectively, as
pop up windows, as can be seen in the next screen:

APPENDIX B. Report of Scores by Domains and Strands
(See section 3.5.4.) The version shown below is for female respondents only, as indicated by the value
Female next to the Gender field in the area beneath the title. Hence, the Number of Respondents is shown
as 4.

APPENDIX C. Scores by Division / District Reports

C.1 Scores by Division / District

As can be seen from the report depicted above, the (average) scores for strands and domains are summarised by district within each division. The overall
scores for each division follow the list of districts in the division. The overall scores for the whole dataset are presented at the end of the report.

C.2 Scores for Each School per Division / District

In the case partly depicted above, as for the report in section C.1, the strand and domain average scores per district within each division are presented, as are
the overall averages for each strand and domain for each division (the scores for the whole dataset again conclude the report). However, in this case the average
score for each school within each district is also displayed.

APPENDIX D. Report based on Archived Data (Trend Report)
(See section 3.6.2. This particular report is based on 3 sets of archived data; the column headings give the
year and month of each archived set.)

