Performance Measurement Lessons Learned

A partner company (www.BMB.ca) and PRS recently wrapped up a Performance Measurement Strategy (PMS) implementation project for a public sector internal services organization. We had a great client team, and I think we achieved the right mix of consulting support and dedicated internal resources to deliver a ‘win’.

The consultants brought some tools, templates and practical experience; the client team brought their organizational knowledge and a genuine interest in embedding performance measurement in management practice (I've heard the latter referred to as BAM – Brains and Motivation).

Together we explored, adapted and implemented a pretty impressive and comprehensive PMS, which not only meets a need but will also look good on everybody’s CV ;-)

This one will make a good case study at some point, but in the meantime here are a few real-world performance measurement lessons learned as documented by our team.

Lesson 1 – Keep good books.  Over several months, we took this project from kickoff to logic model development to indicator definition and feasibility testing to report definitions to data collection and analysis to production of a full Annual Performance Report (whew!).

You can bet we had multiple versions of many documents, and that we often had team members working in parallel on parts of the project.  I can’t overemphasize the need to maintain a central repository with excellent version control on all documents. 

All our documents carried version dates, and we kept a 'drafts' sub-folder within each document folder (one folder each for our Logic Model, indicator definitions, indicator reports, performance measurement matrices, and the annual performance measurement report itself). We set it up so the latest version of each document was the first one you saw when opening a folder.

We also kept all supporting and supplementary documentation (e.g. source data reports) in a folder associated with each indicator. This is critical to enable your performance measurement coordinator to reproduce results accurately and efficiently.
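If you want to automate that 'latest version on top' convention rather than rely on discipline alone, here is a minimal sketch of the idea (not something we built on this project), assuming each document filename carries an ISO-style version date; the folder and file names are hypothetical.

from pathlib import Path
import re

# Filenames are assumed to carry a version date like "Logic Model 2015-03-12.docx".
DATE_PATTERN = re.compile(r"\d{4}-\d{2}-\d{2}")

def latest_version(document_folder):
    """Return the newest dated document in a folder, ignoring the 'drafts' sub-folder."""
    dated = []
    for entry in Path(document_folder).iterdir():
        if entry.is_file():
            match = DATE_PATTERN.search(entry.name)
            if match:
                dated.append((match.group(0), entry.name, entry))
    # ISO dates sort correctly as plain strings, so the maximum tuple is the current version.
    return max(dated)[2] if dated else None

# Hypothetical usage: latest_version("PMS/Indicator Definitions")

Whether or not you script it, the point is the same: the version date travels with the document, so the coordinator never has to guess which file is current.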

Lesson 2 – It’s gonna cost you. I have mentioned this in other newsletters, but it bears repeating: building and implementing a real performance measurement strategy that provides useful information to your management team and stakeholders will cost you time and (probably) money. About a hundred days of consulting support went into this project, and probably double that in internal team time.

We had an enlightened Director and management team who understood this and were willing to commit that level of resources, as well as their own attention. All parties seem to agree this paid off, but understand that, as with most things, you pretty much get what you pay for.

A little side lesson: we kept track of how long it actually took to collect, analyze and report on performance data once all the preliminary development work had been done. It came to about 200 days to collect and analyze data on 21 indicators and produce quarterly and annual performance reports. In other words, about 1 FTE.
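If you want to sanity-check that conversion, the arithmetic is simple; note that the working-days-per-year figure below is my assumption, not a number from the project.

# Back-of-the-envelope check of the effort figures above.
collection_and_reporting_days = 200          # effort once development work was done
indicators = 21
working_days_per_person_year = 220           # assumed figure; adjust for your organization

days_per_indicator = collection_and_reporting_days / indicators                 # roughly 9.5
fte_equivalent = collection_and_reporting_days / working_days_per_person_year   # roughly 0.9

print(f"~{days_per_indicator:.1f} days per indicator, ~{fte_equivalent:.1f} FTE")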

I expect this will get better with experience, and information technology can certainly help, but the lesson abides.

By the way, this also makes the case for keeping the number of indicators under control; remember you get nothing for nothing.

Lesson 3 – Have a plan, but yield to reality. I’m a consultant, and therefore constitutionally inclined to Gantt charts, but the real world has a way of rearing its head in any project. This is nowhere more apparent than in performance measurement. You absolutely should set milestones and allocate resources to achieve project tasks, but don’t expect the path to be smooth or straight.

There is an undeniable learning curve involved in building and implementing a real Performance Measurement Strategy; this means there will be some ambiguity and uncertainty in the beginning, and not every effort will produce an end result.

Following a proven methodology can help, as will good project management and scope management (see the level of effort in Lesson 2). But expect that, as the team learns more, initial assumptions will change. There will be more revisions and ‘do-overs’ than you expect as you adapt to the particular needs of your organization. That’s OK; real performance measurement is an ongoing organizational learning exercise. But sometimes you gotta go with the flow.


If you're a manager in the public sector and would like to discuss how PRS can help with your performance measurement challenges, contact PRS Vice President Charlie Snelling (csnelling@rogers.com, 613 744-4084) or me, Scott Kelland (scott@public-sector-performance.com, 613 302-3924).


