Measuring Monitoring Reporting Performance
Contact
For further information or advice, contact:
Performance Unit
Cabinet Services
Department of the Premier and Cabinet
Email: pm@premiers.qld.gov.au
Telephone: 07 3003 9192
Contents
Introduction
    Background
    Purpose
    Application
Developing performance information
    General principles in developing performance measures
    Setting targets for measures
    Data quality
Measuring performance
    Performance indicators
    Service standards
        Effectiveness measures
        Efficiency measures
        Economy measures
    Other measures of performance
        Activity measures
        Cost measures
        Process measures
        Input measures
        Quality measures
        Location measures
        Timeliness measures
        Equity measures
    Measuring benefits
Monitoring performance
    Performance reviews
    Integration and alignment of performance information
    Evaluation of policies and related services
    Performance analysis
    Benchmarking
Performance reporting
    Mandatory public reporting requirements
    Better practice reporting principles
    Ensuring fair and balanced reporting
Introduction
Background
The Queensland Government Performance Management Framework (PMF) is designed to improve the analysis and
application of performance information to support accountability, inform policy development and implementation and
create value for customers, stakeholders and the community. The PMF enables a clear line of sight between planning,
measuring and monitoring performance and public reporting.
The community expects the public sector to deliver services that are of value, and to deliver them in a manner that upholds public sector ethics principles. Generating value builds trust and confidence in public services.
Through measuring and monitoring results, agencies gain a better understanding of the external drivers of whole-of-Government direction, how those drivers translate into the Government’s objectives for the community, priorities and strategies, and, in turn, how this affects the purpose, vision and objectives of each agency. This clear line of sight enables agencies to identify which services need to be delivered to meet the needs of their customers, stakeholders and the community.
How the actual delivery of these services creates value for customers, stakeholders and the community should be measured, and this information should be used to improve policy development and implementation, and future service delivery.
As public sector agencies spend taxpayers’ money to deliver services, it is important that they are held accountable for
performance, as required by the Public Sector Ethics Act 1994 (PSEA). The PMF, through the Financial Accountability Act
2009 (FAA) and subordinate legislation, addresses this by requiring agencies to publicly report results – through annual
reports and the Service Delivery Statements (SDS).
Purpose
This guide aims to provide agencies with information to assist in measuring and monitoring performance and preparing
public reports.
Application
This guide applies to all Queensland Government departments and statutory bodies.
In most cases, the term ‘agency’ is used in this guide to refer to departments and statutory bodies. When necessary, an
indication is made if specific sections apply to departments only or statutory bodies only.
Developing performance information
‘What gets measured gets done; what gets measured and fed back gets done well; what gets rewarded gets repeated’ (John E. Jones)
Good performance information helps identify what policies and processes work and why they work. Making the best use
of available data and knowledge is critical to improving the performance of government as a whole. Performance
information is key to effective management, including strategic and operational planning, monitoring and evaluation.
It is almost impossible to have the perfect performance measure – defining measures, setting targets and collecting
performance information is a balancing act between using the ideal information and using what is possible, available,
affordable, and most appropriate to the particular circumstances.
Suggested reference
PMF Reference Guide – Developing Performance Information
Department of the Premier and Cabinet
General principles in developing performance measures
A performance measure should be:
relevant – the measure should reflect what the agency is trying to achieve – not simply what is easy to measure
attributable – the activity measured must be capable of being influenced (not necessarily fully controlled) by
actions which can be attributable to the agency or more broadly by government, and it should be clear where
accountability lies
comparable – with either past periods or similar services in other jurisdictions
well-defined – with a clear, unambiguous definition so that data will be collected consistently and the measure is easy to understand and use with minimal explanation. Clear documentation of measurement processes should be maintained (for example, in a data dictionary)
timely – performance data should be produced regularly enough to track progress, and quickly enough for the data to still be of value for decision-making
free of unintended consequences – the measure should not encourage unwanted or wasteful behaviour
reliable and verifiable – able to produce accurate data for its intended use, able to be measured consistently and
be responsive to change
time-framed – it should be clear when the activity measured should be delivered by
achievable – the measure should be stretching and reflect the Government’s ambitions for improved standards of public services. However, it must be achievable within the agency’s available resources
credible – a measure that has the support of stakeholders and where appropriate, is supported by research and/or
established industry standards
cost-effective – in terms of gathering and processing the data.
Setting targets for measures
Targets aid accountability. The aim of targets is to set a level of performance acceptable to government, on behalf of the community, within fiscal limits. Setting target levels is a complex task, as the establishment of a target can raise as many questions as it answers.
Targets should present clear and quantified levels of performance against which agencies can assess their results or
indicate the desired movement of performance. Targets must be expressed as absolute numbers, a range, a percentage,
or a ratio.
Agencies may use a combination of methods to establish targets for measures. Common approaches include (the first two are illustrated in the sketch after this list):
• current performance
• current performance plus/minus a percentage improvement change
• averaged performance (national, state, or industry)
• better practice – if quantifiable benchmarks exist that are considered directly relevant to the activity being measured
• external targets established by professional associations
• external targets set in intergovernmental agreements
• management decisions – calculated decisions given resource (in particular, staffing) limitations.
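As a minimal, illustrative sketch only – the figures, function names and example service standard below are hypothetical, not drawn from this guide – the first two approaches can be expressed as simple calculations (Python):

def target_from_improvement(baseline: float, improvement_pct: float) -> float:
    """Current performance plus a percentage improvement change."""
    return baseline * (1 + improvement_pct / 100)

def target_from_average(peer_results: list[float]) -> float:
    """Averaged performance (e.g. national, state or industry results)."""
    return sum(peer_results) / len(peer_results)

# Hypothetical service standard: applications processed within 10 business days (%).
baseline = 82.0
print(round(target_from_improvement(baseline, 5.0), 1))   # 86.1 – a stretch target, not 100%
print(round(target_from_average([82.0, 79.5, 84.0]), 1))  # 81.8 – a peer-average target

Targets derived in this way should still be tested against the target checklist later in this section before adoption.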
To ensure that targets are not unrealistic or create perverse incentives:
• targets should be set through agency planning processes
• proposed targets should be trialled in parallel with existing targets
• targets should be presented in the context of the service being delivered (not in isolation).
Targets should be challenging but achievable. Stretch targets should be set rather than setting unrealistic targets. For
example, targets for customer satisfaction should not be set at 100%. It is not reasonable to believe that every customer
will be completely satisfied with a service provided by an agency.
Consultation throughout the agency – with service delivery staff in particular – should occur in the target setting process.
Staff at all levels should be clear about their role and responsibilities in the performance against targets, and be held
accountable in some way, for example through individual or team performance objectives. In particular, the individuals
who are best placed to ensure the achievement of a target must feel ownership and responsibility.
As customers, stakeholders and the community are affected by an agency’s business and the services it provides, agencies
should consider including them in the development and/or review process of setting targets where appropriate.
Consultation with customers, stakeholders and the community helps to establish targets that are meaningful and useful
for decision makers.
If achievement of a publicly reported target becomes impractical or not feasible, the agency should explain why that is the case, what legislative, regulatory or other actions are needed to accomplish the target, whether the target should be modified, or whether the performance measure and target should be discontinued.
Target checklist
• Target does not promote adverse results (e.g. efficiency improves to a level that substantially decreases quality)
• Target is challenging, but achievable based on judgement of available information at the time of setting targets
• Target is a clear and quantified measure against which the agency can assess performance
• Target is expressed as an absolute number (i.e. avoid use of words), a range, a percentage, or a ratio
• Target is consistent with objectives and targets set in other government publications
Data quality
In a number of recent audit reports, the Auditor-General has identified issues with the accuracy of externally reported performance information, stating that absent or weak controls over non-financial performance information in agencies have led to inaccurate reporting of performance information (Report 3: 2016–17 Follow-up: Monitoring and reporting performance, p. 32).
Data assurance
Data assurance arrangements for performance information should include adequate documentation of data sources,
collection methods, standards and procedures and clear management trails of data calculations.
A data dictionary is a tool that is used to document the meaning and context of a performance measure, including how
the measure is compiled, how it should be interpreted and reviewed, allocation of responsibility and identification of any
limitations and data risks.
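The PMF does not prescribe a format for a data dictionary. As a minimal illustration only – all field names and values below are assumptions, not a prescribed schema – an entry could be recorded as a structured record such as the following (Python):

from dataclasses import dataclass, field

@dataclass
class MeasureDictionaryEntry:
    """One data dictionary entry for a performance measure (illustrative only)."""
    name: str                  # clear, unambiguous title of the measure
    definition: str            # what the measure means and how it should be interpreted
    data_source: str           # where the underlying data comes from
    calculation: str           # how the measure is compiled
    responsible_officer: str   # where accountability for the measure lies
    limitations: list[str] = field(default_factory=list)  # known limitations and data risks

entry = MeasureDictionaryEntry(
    name="Applications processed within 10 business days (%)",
    definition="Share of applications finalised within 10 business days of lodgement.",
    data_source="Case management system extract (monthly)",
    calculation="on_time_applications / total_applications * 100",
    responsible_officer="Director, Service Delivery",
    limitations=["Excludes applications withdrawn before a decision is made"],
)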
The Australian Bureau of Statistics Data Quality Framework (ABS DQF) provides standards for assessing and reporting on
the quality of statistical information. It is a tool that improves a user's ability to: decide whether a data set or statistical
product is fit for purpose (which in turn helps to identify data gaps); assess the data quality of seemingly similar
collections; and interpret data.
The ABS DQF is designed for use by a range of data users and providers in different settings, including government
agencies, statistical agencies and independent research agencies. For example, the ABS DQF is used to assess the quality
of performance indicator data linked to a number of intergovernmental agreements in key policy areas.
Suggested reference
Australian Bureau of Statistics Data Quality Framework
Australian Bureau of Statistics
Measuring performance
Performance information needs to be collected and used at all levels in an agency. Performance information should help
to understand how well the agency, parts of the agency, and individuals are performing.
Performance information should help to inform decision-making, as well as describing whether the required level of
performance has been achieved.
Performance indicators
Performance indicators show the extent to which the outcomes achieved by an agency are meeting the objectives in the
agency’s strategic plan.
When dealing with outcomes, direct measures are often difficult – for this reason measures can often only ‘indicate’ the
outcome rather than directly measure it. Often it takes more than one performance indicator to adequately capture an
outcome.
The Financial and Performance Management Standard 2019 (FPMS) (section 10(2)(c-e)) describes the need for an
agency’s objectives to be delivered ‘efficiently, effectively and economically’.
Each agency objective in its strategic plan must have one or more relevant and appropriate performance indicators. Some
performance indicators may be relevant for more than one agency objective.
Reporting actual results against the performance indicators should demonstrate the extent to which the agency objective
is being achieved. Agencies are encouraged to develop and set targets for performance indicators where possible.
Knowing how well the agency is currently performing against its objectives is essential to determine whether the agency needs to alter its strategies or policies, or re-evaluate its objectives, to ensure value is delivered to its customers, stakeholders and the community.
Service standards
The FPMS describes the need for an accountable officer or statutory body to monitor whether the agency objectives are
being achieved ‘efficiently, effectively and economically’ and for an agency to deliver ‘the services stated in its
operational plan to the standard stated in the plan’.
The Service Delivery Statements (SDS) present a Service Performance Statement in which each agency details its non-financial performance information. This performance information includes a selection of service standards for each service area or service to demonstrate the efficiency or effectiveness of service delivery.
Service standards are set with the aim of defining a level of performance that is appropriate and expected to be achieved,
enabling government and the public to assess whether or not agencies are delivering services to acceptable levels of
efficiency and effectiveness.
All agencies should have a balanced set of service standards, including a combination of quantitative and qualitative
measures, which incorporate better practice characteristics. Collectively, service standards should provide good coverage
of the service area/service (including coverage against funds used) by the agency.
It is crucial that the service performance data collected is accurate and can be relied upon as a valid assessment of an
agency’s performance.
The approach to service standards in the PMF is consistent with the general performance indicator framework used in the
Report on Government Services (RoGS). The service logic diagram shown in Figure 1 is widely used in Australia to depict
the concepts underpinning the framework used for measuring and managing the non-financial performance of services.
[Figure 1: Service logic diagram – external influences, service delivery, efficiency, service effectiveness and cost-effectiveness]
Effectiveness measures
Effectiveness measures reflect how well the actual outputs of a service achieve the agency’s stated purpose (objective),
describing the quantifiable extent of the outcome experienced by recipients as a result of the level and quality of the
service provided.
Effectiveness is often measured through customer and/or stakeholder satisfaction/experience surveys. To be considered
a proxy measure of effectiveness, the survey must seek feedback on all drivers of satisfaction. Feedback on a single driver
of satisfaction such as timeliness is a measure of quality, not effectiveness. More detailed information on measuring
customer satisfaction is provided in the PMF Reference Guide – Measuring customer experience.
Suggested references
PMF Service Delivery Statements: Performance Statement Better Practice Guide
Department of the Premier and Cabinet
Efficiency measures
Efficiency measures reflect how capabilities (resources) are used to produce outputs for the purpose of achieving desired
outcomes. They are expressed as a ratio of capabilities (resources) to outputs.
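As a minimal sketch with made-up figures (neither the amounts nor the service are taken from this guide), an efficiency measure of this kind reduces to a simple ratio (Python):

# Efficiency expressed as a ratio of resources consumed to outputs produced.
total_cost = 1_250_000.00   # hypothetical annual cost of the service ($)
outputs_delivered = 41_500  # hypothetical outputs (e.g. applications processed)

cost_per_output = total_cost / outputs_delivered
print(f"Cost per application processed: ${cost_per_output:.2f}")  # $30.12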
The concept of efficiency has three dimensions. The Report on Government Services describes overall economic efficiency
as requiring satisfaction of technical, allocative and dynamic efficiency:
• ‘technical efficiency requires that goods and services be produced at the lowest possible cost
• allocative efficiency requires the production of the set of goods and services that consumers value most, from a
given set of resources
• dynamic efficiency means that, over time, consumers are offered new and better products, and existing products
at lower cost’.
While measuring efficiency is important, it needs to be in conjunction with measuring effectiveness. Government services
which are provided efficiently may not necessarily be meeting customer, stakeholder or the broader community’s needs.
Economy measures
While measuring economy is a requirement of the financial legislation (the FAA and FPMS), neither instrument provides a definition of ‘economy’. However, there are various definitions that can be drawn on. The Australian National Audit Office (ANAO) defines economy as ‘minimising cost’ (Performance Auditing in the Australian National Audit Office, p. 3). Similarly, definitions of ‘economical’ include:
• ‘careful, efficient, and prudent use of resources’ (Merriam-Webster Dictionary, 2017)
• ‘giving good value or return in relation to the money, time, or effort expended; careful not to waste money or
resources; using no more of something than is necessary’ (Oxford Dictionaries Online, 2017).
Given these definitions, the approach to measuring the efficiency and effectiveness of service delivery detailed above will also measure the economical aspect of service delivery.
Other measures of performance
Activity measures
Measure the number of service instances, service recipients, or other activities for the service. They demonstrate the
volume of work being undertaken. They are generally measures of busyness. While not generally demonstrating the
achievement of service objectives, activity measures provide a basis for judging whether an agency is contributing to the
desired social change of the service being delivered. Activity measures can often be converted into efficiency measures by
combining them with input measures to show the unit cost of the activity.
Cost measures
Measure the cost of outputs/services produced (direct and/or fully absorbed). Ideally, the outputs are uniform and the cost per unit of output provides an obvious benchmark for measuring performance, both over time and between like service providers. However, such uniformity is not always possible. Examples include the average cost of schooling per student and the average cost of processing an application.
Process measures
Measure throughput, or the means by which the agency delivers the activity or service, rather than the service itself.
Process measures demonstrate how efficiently services are delivered, rather than how effectively services are delivered,
and are sometimes used as proxies for effectiveness measures if it is impractical or uneconomical to measure the
effectiveness of the service or its outcome.
Input measures
Measure the resources consumed in delivering a service, either as an absolute figure or as a percentage of total
resources. Input measures may be measured in terms of funding, number of employees, person-days, equipment,
supplies etc, and can often be converted to efficiency measures by combining them with activity measures to show the
unit cost of the activity.
Quality measures
Measure how well a service is fit for purpose, for example, extent to which outputs conform to specifications. The quality
of a service can be measured using specific criteria (timeliness, accuracy, completeness, accessibility and equity of access,
continuity of supply, and/or seeking feedback on one of these criteria through customer satisfaction surveys). Quality
itself is one dimension of effectiveness, but does not necessarily fully represent how effective a service is (e.g. a service
could be high quality, but still not effective).
Location measures
Measure where the service is delivered. This is usually used as a measure of access and equity for customers in rural, remote or targeted locations.
Timeliness measures
Measure the time taken to produce an output and provide an indication of the processing or service speed. Measures of
timeliness provide parameters for ‘how often’ or ‘within what time frame’ outputs are to be produced.
Equity measures
Measure how well a service is meeting the needs of particular groups that have special needs or difficulties in accessing
government services. For example, measures disaggregated by sex, disability status, ethnicity, income and so on. Equity
measures focus on any gap in performance between special needs groups and the general population.
Equity indicators may reflect equity of access – all Australians are expected to have appropriate access to services – and equity of outcome – all Australians are expected to achieve appropriate outcomes from service use (Report on Government Services, 2016, p. 1.14).
Measuring benefits
Improvements that an agency carries out to increase its capability to deliver benefits, whether undertaken through formal projects or through continuous improvement processes, also create value for the agency.
Value for the agency can be measured by identifying the benefits expected from an initiative.
Benefits which demonstrate value for money in agency procurement could include: the advancement of the
Government’s objectives for the community; improved fitness for purpose; improved service quality; improved customer
support; reduction in whole-of-life costs; reduction in transaction costs associated with procurement; reduction in
holding, maintenance and/or disposal costs. Examples of other benefits include employee satisfaction and reduction in
processing time from re-engineering internal processes.
Suggested reference
Benefits Management
Building Queensland
Monitoring performance
The purpose of monitoring results is to identify areas of good performance and areas where performance can be
improved.
Performance reviews
A performance review can be defined as ‘. . . a series of regular, periodic meetings during which the [executive leaders]
use data to discuss, examine and analyse, with the individual [unit director], past performance, future performance
objectives and overall performance strategies’ (Behn, R. 2006, p.332).
A consistent effort to improve the performance information of agencies reflects the fact that previously selected
measures are subject to change over time – agency objectives change, priorities change, different users emerge.
Performance information should be regularly reviewed and updated to reflect such changes in priorities and shifts in the
focus of public policy.
The continued appropriateness (including factors such as relevance, cost, value and usefulness) of performance
information should be regularly assessed.
As part of the continuous improvement of the PMF, each year, the Department of the Premier and Cabinet and
Queensland Treasury (the central agencies) work with agencies to review service areas, service standards and targets
that are published in the SDS.
Principles to assist agencies when reviewing performance information in the SDS are provided in PMF SDS: Performance
Statement Better Practice Guide.
Suggested reference
Good Practice Guide – Performance reviews
Queensland Audit Office
Integration and alignment of performance information
The value of performance information depends on how well it is aligned. To the extent
appropriate, responsibility for all objectives and services and associated performance information should be clearly
assigned to relevant agency business areas or position holders. This helps ensure a common understanding about
respective contributions to the delivery of services and the achievement of objectives by the relevant business areas of an
agency.
Evaluation of policies and related services
It is good practice to regularly undertake evaluations of all agency policies and related services to ensure that value
created for customers, stakeholders and the community is being maximised.
Evaluation is the systematic, objective assessment of appropriateness, effectiveness and/or efficiency of a policy. A
commitment to rigorous evaluation is an important aspect of government accountability, especially in circumstances
where a policy is new (and the results cannot be reasonably foreseen as a result of prior research) and expensive. The
rigorous evaluation of policy initiatives also helps to build an evidence base that in turn can be used to inform the
development of future policies.
Better practice evaluation would include processes for the ongoing analysis and evaluation of performance information
and measures, including variance analysis of results and progress to date against targets and/or standards.
It is expected that agencies would have processes for the continuing analysis and formal evaluation of a service area’s activities, its measures and their continued relevance to whole-of-government outcomes and priorities, to enable continuous improvement in service delivery.
The Queensland Government Program Evaluation Guidelines outline a set of broad principles to underpin the planning
and implementation of evaluations for programs funded by the Queensland Government.
In relation to preparing evaluations of assets, the FPMS (section 18(4)) requires accountable officers to have regard to the
Project Assessment Framework (PAF) and the Value for Money Framework (VfM). An evaluation is only mandated when
the accountable officer or statutory body considers the cost of acquiring, maintaining or improving a physical asset is
significant.
Suggested references
Queensland Government Program Evaluation Guidelines
Queensland Treasury
Project Assessment Framework Overview (incorporating the Value for Money Framework)
Queensland Treasury
Performance analysis
A useful tool for understanding performance results is trend analysis, which presents data showing how performance changes over a period of time.
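As an illustrative sketch only (the results below are invented), a simple trend can be computed as the period-on-period change in a measure (Python):

# Trend analysis: period-on-period change in a measure (hypothetical results).
results = {"2019-20": 78.0, "2020-21": 80.5, "2021-22": 83.0}  # e.g. satisfaction (%)
periods = list(results)
for prev, curr in zip(periods, periods[1:]):
    change = results[curr] - results[prev]
    print(f"{prev} to {curr}: {change:+.1f} percentage points")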
By contrast, variance analysis compares performance measures against each other from one period to another, from one
agency to another, or from target to actual. This type of analysis provides information about what drives the variances.
Agencies are required to explain material variances between targets and their estimated and/or actual results in their
annual reports and SDS. What constitutes a ‘material’ variance is subjective and will depend on the particular measure
being analysed. A variance is generally considered to be ‘material’ if it is of such a nature that its disclosure would be
likely to influence decision-making by users of the information reported.
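As a sketch only – the measures, results and five-percentage-point threshold below are assumptions, and materiality remains a judgement for each measure – variance analysis can be supported by flagging results whose variance from target exceeds an agreed threshold (Python):

# Variance analysis: flag material variances between targets and actual results.
measures = {
    "Customer satisfaction (%)":          {"target": 85.0, "actual": 78.0},
    "Applications processed on time (%)": {"target": 90.0, "actual": 91.5},
}
MATERIALITY_THRESHOLD = 5.0  # assumed; in practice a judgement per measure

for name, result in measures.items():
    variance = result["actual"] - result["target"]
    status = ("material - explain in annual report/SDS"
              if abs(variance) > MATERIALITY_THRESHOLD
              else "within tolerance")
    print(f"{name}: variance {variance:+.1f} ({status})")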
Benchmarking
Benchmarking involves the collection of performance information to undertake comparisons of performance. The three
main forms of benchmarking are:
• results benchmarking – comparing performance within and between organisations using measures of effectiveness
and efficiency
• process benchmarking – analysing systems, activities and tasks that turn inputs into outputs and outcomes
• setting better practice standards – establishing goals and standards to which organisations can aspire.
(Report on Government Services, 2016, pages 1.10-1.11)
Agencies proposing to conduct benchmarking should first consider the legislative and/or policy differences of each jurisdiction and how comparable the data is. There are numerous benchmarking resources available for agencies to use.
Performance reporting
Mandatory public reporting requirements
Agencies are required to report on performance in a number of ways, with key mandatory requirements for public
reporting including:
Annual reports – required to be tabled in the Legislative Assembly each financial year (Financial Accountability Act 2009, section 63). Annual reports must be prepared in accordance with the annual report requirements for Queensland Government agencies issued by the Department of the Premier and Cabinet.
Budget Papers – in particular, Budget Paper 5 – Service Delivery Statements (SDS). In preparing SDS Performance Statements, agencies are required to comply with the requirements outlined in the PMF Service Delivery Statements: Performance Statement Better Practice Guide.
Performance data alone does not generally tell the performance story.
There has been debate about the abundance of performance measures and the need to reduce them to a more
meaningful state. Thomas¹ recommends that:
‘In the future, less emphasis should be placed on reporting data and more should be placed on allowing
program managers to tell the performance story’.
It is important to include contextual and explanatory information in reports, such as an analysis of performance
information, to communicate the meaning of the level of performance achieved and how it is to be interpreted.
Contextual and explanatory information may refer to:
• the rationale for the selection of performance information reported
• the significance of each performance indicator, service standard or other measure
• the environment in which the agency is operating (i.e. economic, social and environmental)
• external factors that may have impacted on performance
• whether performance is within acceptable tolerances, asking if results:
o exceed expectations – are there any adjustments that need to be made?
o are below expectations – are there compensating improvements in other areas, and/or higher priorities?
o are not effective or performing poorly – does this need to trigger critical reflection and/or a change of approach?
¹ Performance Measurement, Reporting, Obstacles and Accountability – Recent Trends and Future Directions, Dr Paul Thomas, ANU Press, 2006
Better practice reporting principles
• focus on the few critical aspects of performance
• look forward as well as back
• explain key risk considerations
• explain key capacity considerations
• explain other factors critical to performance
• integrate financial and non-financial performance information
• provide comparative information
• present credible information, fairly interpreted
• disclose the basis of reporting.
Source: Performance Measurement, Reporting, Obstacles and Accountability – Recent Trends and Future Directions, Dr Paul Thomas, ANU Press, 2006
Ensuring fair and balanced reporting
The Auditor-General notes that performance information should be balanced, addressing the agency’s key activities, and should report both the good and the not so good achievements (Auditor-General’s Report to Parliament No. 4 for 2007, p. 7).
Better practice performance reporting involves being open about the extent of, and reasons for, the results achieved – whether the results are above or below the expected level of performance. It also includes explaining what the agency plans to do in response, to the extent that the situation is within its control.