What you are asking for is hard to find, as most companies don't share metrics outside the organization. Unfortunately, the openness of the social media world doesn't extend to benchmarking studies. There are some metrics published by organizations offering communities, but not many that cover actual results over time.
Of course, for those of us in the consulting business these metrics are also very dear to our hearts as they are one of the value propositions we bring into the engagement...
Most of the benchmarking now comes from private groups like our online community roundtable. We do share metrics among members, but no members of the group are competitors, and we agree not to publish anything without permission.
I would agree with Mike's statements about community administrators' reluctance to share benchmarking statistics. It's a problem I've thought about a good deal. If a particular benchmark measurement can be defined concisely, there are often ways to find the information online or to compile it in automated ways. One very good discussion of different benchmarks for online communities is the report available at Benchmarks for Building Extranets and Online Communities. What specific benchmark measurements are most important to you? I might have some advice on how to search for the results.
In a discussion with another Clearstep user, Ian Clayton, he brought a really interesting resource to my attention: the KPI Library (Key Performance Indicator Library). They have the lofty goal of defining, documenting, and sharing KPIs between peers and across industries. I can't vouch for the quality of what they've achieved, but it looks like a very large community of people sharing their data. I'm going to continue looking into the service.
Here is a brochure on a benchmark study (see the last page).
Here are the MP3s.
Excellent, thanks John. I'll take a look at those resources.
I was browsing around KnowledgeBoard today and found a couple more. Actually, I'm mistaken: these are measurement docs, not benchmarks. Oh well, I'll share them all the same.
"CoP influence on productivity depends on the frequency of exceptions and their type: they need to be recorded, and baselined (we need to know the standard pre-CoP cost of dealing with them), if we are to measure CoP financial impact".
Thanks to everyone for offering up these resources. Most interesting.
I am historically skeptical of industry standards and averages. I remain unconvinced that the vast differences among companies, industries, measurement practices and even definition of terms can truly be overcome well enough to make any standards or averages meaningful. However, I see great value in reading survey results, case studies, and any kind of research that explains the practices behind the results. That's where I seek new ideas and insights, so I welcome any leads on research.
I'm in charge of an internal deployment of Clearspace (we're six months in) and I'm trying to determine what metrics to establish to measure our success (and areas that need improvement). Seeing the questions asked in research studies is most helpful to me in this, as it gives me ideas for potential metrics we could adopt.
I also expect that whatever metrics we adopt this year, they will need to evolve and develop over time. This will make it harder to trend the progress of our metrics over time, of course. But it seems to me unlikely that we can know today what metrics will best measure our usage a year or two from now. I have to believe that with time and experience we will learn more about how we are using social media, and so will come up with better ways to measure it.
Wow Ted, that was so brilliantly said that I might have to quote you in a blog post I'm drafting on ROI... if that's OK.
Dave Snowden often talks about the danger of the recipe approach to deploying things like software, processes, strategy, etc.
- something that works successfully in one context won't necessarily work in another (due to complexity: too many different variables at play)
And you point out that the same holds for measuring value:
- what works for one doesn't mean it will work for others
- a benchmark set of numbers just doesn't cut it, as every situation is different
Dave Snowden has a couple of links on these research methods.
Anecdotes are much richer and more contextual, and come from your own playground, as a measure of value... rather than an arbitrary industry number (benchmark).
I like how you put it: the richness of studies (as you say, "the practices behind the results") gives you ideas that may cross over once re-framed for your situation.
You also bring up a good point that the criteria or measurements you use may organically vary along the lifespan of what you are measuring.
As you allude to, your KPIs need to be suited to your context, and need to adapt as your community grows.
Also important: it's not all about outcomes. KPIs are not going to measure emergence, as we don't know what's going to emerge.
- so this is where anecdotes are really handy, as they give us answers to questions we didn't think to ask
I see this thread as dealing with *benchmarks*, which is a bit different from ROI: a benchmark is a place to aim for, based on some industry average that may not really apply to your situation. But the same applies to ROI: you've got to use your own KPIs, not something you bought in a packet.
I just thought of something else. KPIs, from the outset, steer you to perform in a certain way to reach a target... but we must remember that this "focusing of efforts" or "planning" approach should not be 100% of your method. You also need a relaxed approach to let things emerge. What I'm trying to say is that if we plan to act a certain way, we forgo letting any gifts of emergence have the chance to surface.
Sometimes when we don't plan things, we are not acting in a certain way to reach a target, and this allows things to emerge because we are open to them happening.
Plus, with a 100% targeted plan approach we may use all our efforts to reach the target, rather than doing something more valuable.
I quote Venkat's post on planning and outcomes in one of my blog posts.
Thanks, I like the points about emergence and KPIs. Dave Snowden takes me back to knowledge management and complexity work for the UK NHS! In fact, now I recall that the NHS Modernisation Agency reported on the use of 'social movement' models for managing complexity, as there's material from that social science which relates to the value of networks and emergence.
I'd be flattered to be quoted, John
Enjoying this discussion and hoping for more people to join in with other ideas.
Ted - I realize that a lot of water has gone under the bridge since this thread was last visited, but I'm wondering: now that you have 18 months of experience with your deployment, what have you been looking at to measure its success?
This is a FANTASTIC thread. Seriously! Copying Jennifer Erzen here, as well.
Question for everyone here. Are there any thoughts on "average time on site"? We have roughly a 9-minute average, and I am trying to work out if that is good or bad. I know that will depend on what we have on the site and how it's used internally. We use it as a social intranet, but more of an opportunistic one. A lot of core intranet function lives at each of our business units... but our site is the glue that binds all businesses together. So we have a lot of strategic content (video included), groups around topics of interest, etc.
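One thing worth checking before judging that number: "average time on site" is usually a mean of session durations, and a handful of long sessions (someone watching a video, or leaving a tab open) can inflate it. Comparing the mean with the median tells you whether 9 minutes reflects a typical visit. A minimal sketch with hypothetical session data (most analytics tools will report both for you):

```python
from statistics import mean, median

# Hypothetical session durations in minutes, e.g. exported from an
# analytics tool. Two long sessions pull the mean well above what a
# typical visitor experiences.
sessions = [2, 3, 4, 5, 5, 6, 7, 30, 45]

avg = mean(sessions)    # skewed upward by the long sessions
mid = median(sessions)  # closer to the "typical" visit
print(f"mean: {avg:.1f} min, median: {mid:.1f} min")
```

If your median is far below your mean, the 9-minute figure may be driven by a small group of heavy users rather than broad engagement.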
Any general thoughts on this topic would be much appreciated. And, thanks again to everyone here for the tremendous links, postings, and insight! +1 :-)
I am curious: when are we going to get something we could apply to our business on a monthly basis? I have been asking for it for a while, knowing that we are all looking for it. Hopefully someone at Jive can provide guidance and data? Or perhaps it's worth organizing a community that shares specific data applicable to all and develops benchmarks?