Shared Services Performance Measurement – Part Deux

In a previous article about Shared Services in the public sector, I talked about the need to establish a performance baseline for the Shared Services organization, and noted that effective Shared Services performance measurement requires a plan, a budget and dedicated resources to make it happen.

In this article I'm going to revisit the topic, and add a few more observations about the possible foibles and pitfalls of measuring performance for public sector Shared Services.

Let me start by identifying a couple of key differentiators between Shared Services and a traditional 'centralized' service delivery model. While both models implicitly share the goal of efficient service delivery (and therefore cost savings), I believe the single biggest difference lies in the approach and attitude towards consumers of the services.

Shared Services organizations typically have 'customers' or 'clients'; centralized service delivery organizations have 'end users'. This means activities like Customer Relationship Management and establishing Service Level Agreements are much more important to the Shared Services organization.

Customer Relationship Management involves providing many venues for two-way customer engagement (e.g., Customer Boards or Customer Advisory Groups) and multiple channels for communication (e.g., service desks, dedicated client relationship managers, issue tracking systems, etc.).

Service Level Agreements define the responsibilities for service delivery, shared between the service provider and the customer.

This means that part of the 'performance story' for a Shared Services centre should include your degree of success with these activities.

The way ahead for Shared Services performance measurement therefore includes:

1 – identifying outcomes that are meaningful to your customers;

2 – identifying the activities and outputs that need to 'go right' to achieve these outcomes;

3 – documenting the venues and channels for customer engagement and communication;

4 – establishing practical measures for all of these.

This shouldn't be difficult; by the nature of the work they perform, Shared Services organizations produce a lot of administrative data. Early stages of Shared Services performance measurement should focus on the 'art of the possible', i.e., working with available data and easily accessible data sources.

Final point: don't be in a rush to establish targets. Setting targets can be tricky; it may be better, at least initially, to focus on demonstrating improvement over time.


If you would like help in defining your performance story, or with any aspect of building a practical performance measurement system for your organization, contact PRS Vice President Charlie Snelling at 613-744-4084 or Charlie@public-sector-performance.com.


