21 Replies Latest reply on Oct 12, 2012 1:01 PM by bchamberlain

    Metrics and Analytics

      I was wondering how everyone else is tackling this problem:

      Jive's out-of-the-box analytics are not that great. I am struggling to find out how well our community is performing.


      How does Jive assess an "active user"? How do I know that 50% of my "active users" are not employees? How do you guys drill down into getting more detailed metrics and a greater understanding of what's happening on your communities?


      We use Google Analytics, but beyond showing us how many people hit how many pages, we don't get much more out of it.

        • Re: Metrics and Analytics


          Yes, we are struggling to get good numbers.  We have a user property that identifies internal and external folks.  Using the analytics module, or having someone who can access the database dump to pull the data, gets you close.  If the community is like a web page where read-only traffic is the measure of success, then Google Analytics and Omniture are OK.  If it is a real community, then you are interested in activity.  The Jive definition of an active user is too broad for me (they count logins and views as activities).  I have narrowed the report down to look at people who create replies.  Status points also reflect people who are asking a lot of questions, so that is not a good gauge for me.
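
          The reply-based "active user" cut described above can be sketched in a few lines. The record shape, the `user_type` property, and the action names are hypothetical stand-ins, not Jive's actual schema:

          ```python
          # Hypothetical activity records, e.g. pulled from an analytics database dump.
          # "user_type" stands in for the internal/external user property mentioned above.
          activities = [
              {"user": "alice", "user_type": "external", "action": "reply"},
              {"user": "bob",   "user_type": "internal", "action": "reply"},
              {"user": "carol", "user_type": "external", "action": "view"},
              {"user": "alice", "user_type": "external", "action": "login"},
          ]

          # Jive's broad definition counts logins and views as activity; here a user
          # counts as "active" only if they are external AND created at least one reply.
          active_repliers = {
              a["user"] for a in activities
              if a["user_type"] == "external" and a["action"] == "reply"
          }

          print(sorted(active_repliers))  # only alice qualifies
          ```

          The same filter generalizes to whatever action set you decide counts as "real" activity.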



            • Re: Metrics and Analytics

              I tie a lot of Google Analytics metrics into my KPIs, namely:


              Unique Visitors
              % New Visits
              Pageviews per visit
              Bounce Rate
              Time on Site
              User Creation (from Jive)
              % Returning Visits



              A common question from management is "What does this mean?" and "What exactly are you trying to measure?"

              I came up with what I think is a nice way to create a framework around these numbers. In '12 we will measure information on 3 levels:


              Attraction - The ability to attract an initial audience.

              Attention - The ability to 'reel them in' and have them go deeper into community content.

              Adoption - The ability to 'convert' them into Community users and have them contribute to discussions.


              When placing the aforementioned metrics into the three framework categories above, there is a clearer understanding of what we are actually measuring:


              Attraction - Unique Visitors
              Attraction - % New Visits
              Attention - Pageviews per visit
              Attention - Bounce Rate
              Attention - Time on Site
              Adoption - User Creation
              Adoption - % Returning Visits
              Adoption - Interactivity Rate
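
              The category-to-metric mapping above can be kept as a simple data structure so each KPI is always reported under its framework category (a minimal sketch, not tied to any particular reporting tool):

              ```python
              # Attraction / Attention / Adoption framework as a plain mapping.
              framework = {
                  "Attraction": ["Unique Visitors", "% New Visits"],
                  "Attention":  ["Pageviews per visit", "Bounce Rate", "Time on Site"],
                  "Adoption":   ["User Creation", "% Returning Visits", "Interactivity Rate"],
              }

              # Print each KPI labeled with its category, e.g. for a monthly report.
              for category, metrics in framework.items():
                  for metric in metrics:
                      print(f"{category}: {metric}")
              ```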


              Hope this helps.

            • Re: Metrics and Analytics

              This is a bit of a shameless plug, but we use our QlikView software to import data from both the analytics and application databases and perform the analysis in there. We have actually "modified" Jive's definition of what active and passive mean; instead, we mapped activities which we consider "active" - posting, commenting, voting, etc. - and created a new field based on that information. We are currently tweaking our version of the app and hope to have a public demo out sometime in the new year.


              We showed an early version of this app at JiveWorld 11; if you have access to the session video you can watch it there.

                • Re: Metrics and Analytics

                  Possible to get a trial version of this for 4.5.6?

                  • Re: Metrics and Analytics

                    I bet Bill Chamberlain would have something to say about the effectiveness of Qlikview and the analytics framework he's helped build. He demoed it for us a few weeks ago and it was very impressive.

                    • Re: Metrics and Analytics

                      Hey Jason -- I realize you posted this months ago, but Jonathan's note caught my eye, as did the mention of QV. We are big consumers of QV at Mentor, and I believe you have worked with one of my colleagues here at Mentor. I would definitely like to better understand how we can bring any Jive data into QV, tweak it to our needs, and better drive to good conclusions. Kim is going to contact you next week. My situation is much like some of the earlier posters' in this thread in using GA. For instance, I would like to use GA to record a login, but I'm not sure I have the tech chops to implement it. And I can't get the time of more technical people to help.

                    • Re: Metrics and Analytics


                      There are great options out there, such as QlikView as Jason mentions.  It would be great to hear about your experiences with such tools.


                      I also want to plug a new Jive plugin (ha!) called Community Manager Reports (CMR) that will be released within a couple weeks. 


                      COMMUNITY MANAGER REPORTS

                      Who is it for?      Community Managers and Group or Space owners

                      What does it do?    Integrates easy-to-use charts into the Jive experience that show how communities are growing and behaving

                      How do I get it?    Free add-on to Jive 4.5.7 or 5.0.1


                      Target release date: December 14, 2011.  (See Community Manager Reports release timing and supported version information)


                      As this rolls out, I will be very interested to get info from you guys on what does and does not meet your needs for managing external communities so we can fill key gaps as quickly as possible.  To the original question of how to define Active User, here is how it is done in CMR:

                      A user is considered Active on a particular day if he or she has done anything (including a View) within the previous 30 days.  This applies at the level of the whole community and also at the level of Groups and Spaces.  So a user is Active in Group Foo if he or she has viewed something in Foo within the last 30 days.  We also define a Participating User as someone who has done something beyond a View -- e.g. added a Comment or Liked something -- within the last 30 days.  And we define a Contributing User as someone who has created new content over the last 30 days.  The theory behind these definitions is to capture the level of RISK that a user is willing to take -- creating a new Discussion is putting yourself out there a lot more than just reading stuff.
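
                      A rough sketch of these tiered definitions follows. The event-log shape and the action names are hypothetical; CMR's real implementation works off Jive's analytics data, not this structure:

                      ```python
                      from datetime import date, timedelta

                      # Actions "beyond a View" (Participating) and content creation (Contributing).
                      # These sets are illustrative, not CMR's exact activity taxonomy.
                      PARTICIPATE = {"comment", "like", "vote", "reply"}
                      CONTRIBUTE = {"discussion", "document", "blog_post"}

                      def user_tier(events, user, as_of, window_days=30):
                          """Return the highest engagement tier for `user` as of day `as_of`."""
                          cutoff = as_of - timedelta(days=window_days)
                          recent = [a for (u, d, a) in events if u == user and cutoff < d <= as_of]
                          if any(a in CONTRIBUTE for a in recent):
                              return "Contributing"
                          if any(a in PARTICIPATE for a in recent):
                              return "Participating"
                          if recent:  # anything at all, including a View, counts as Active
                              return "Active"
                          return "Inactive"

                      # Hypothetical event log: (user, date, action).
                      events = [
                          ("dana", date(2011, 12, 1), "view"),
                          ("dana", date(2011, 12, 5), "comment"),
                          ("erin", date(2011, 11, 28), "discussion"),
                      ]
                      print(user_tier(events, "dana", date(2011, 12, 10)))  # Participating
                      print(user_tier(events, "erin", date(2011, 12, 10)))  # Contributing
                      ```

                      Restricting `events` to a single Group or Space gives the per-place version of the same tiers.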


                      Here is an example of the kind of charts we will provide (this chart shows user adoption in a Group):




                      Another thing that might be useful for you (but, again, I look forward to hearing about your experiences) is that CMR supports showing only Users that match a filter that can be defined using profile fields configured to be Searchable in your environments.  


                      In addition to User Adoption charts, CMR shows stuff like Content over time, Answered Questions, profile completion, and top Places.



                        • Re: Metrics and Analytics

                          Will community management reports be able to address any of the following:


                          1. Trending topics over a customizable period of time (by tag for example): e.g. so we could make observations such as "discussions on content tagged 'Iran' died down in March but picked back up in April?"
                            1. If this is possible then can it also be segmented via profile fields?
                          2. Content by category breakdown over time (e.g. If I have categories setup to represent statuses: e.g. Open, Closed, Dismissed. I'd like to be able to see how many pieces of SBS content over a given period of time were open, closed, or dismissed)
                            • Re: Metrics and Analytics

                              On the first item, the short answer is "No, not in the first version."  This is a good item for our backlog.  One thing that is already on the radar is trending on search query terms.  My impression is that tagging can be pretty hit or miss, but search queries are a pretty dependable measure of community interest in a particular topic.  But this is certainly debatable; I would be interested to hear your impressions of what reliable indicators of interest are.  To the extent that you can separate topics by Group or Space, you can get what you are looking for, including restricting to users matching a query.  (Though I realize that you would like to get more fine-grained than that.)


                              On your second question: we are not currently reporting anything regarding Categories.  The one thing we do offer in terms of breaking down content by state regards Questions.  We show all Questions over time (globally or per Group/Space) and how many of them have a Response, how many have a Helpful Response, and how many have a Correct Answer.  In addition, we give indications of the average length of time questions were open before being Responded to, getting a Helpful Answer, and getting a Correct Answer.  The Question charts can be filtered by the person asking the question.  For example, I could look at Questions over time in a Group that were asked by people in Palo Alto.

                                • Re: Metrics and Analytics


                                  We were interested in looking at "the average length of time questions were open before being Responded" as well as tracking the time for each question.  How are you reporting response time?  We have also found that, just as depending on tags is risky, depending on people returning and marking answers as Correct or Helpful is not dependable.  We are looking at the number of questions with no replies as a better indicator of whether questions are being answered.



                                    • Re: Metrics and Analytics

                                      Here is how we deal with Questions in Community Manager Reports:


                                      First, we track the total number of questions asked (either in the whole community or within a Group or Space), and then we track three subsets of these Questions: Questions with at least one Response, Questions with at least one Helpful response, and Questions with a Correct response.  We then graph these values as trend lines over time.


                                      We give an indication of response times with the following scheme: on a given day, some number of questions will get their first response.  We calculate the average length of time between when those questions were asked and got their first response.  Similarly, for all questions that got their first Helpful answer on that day, we calculate the average length of time between when they were asked and got that Helpful answer.  And the same for Correct answers. 
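
                                      The averaging scheme described above can be sketched as follows. The day numbers and question records are made up for illustration; CMR computes this from its own data:

                                      ```python
                                      from collections import defaultdict

                                      # Hypothetical (asked_day, first_response_day) pairs, in day numbers.
                                      questions = [(1, 3), (2, 3), (1, 5), (4, 5)]

                                      # For each day, average the ask-to-first-response gap over the
                                      # questions that received their first response ON that day.
                                      by_response_day = defaultdict(list)
                                      for asked, responded in questions:
                                          by_response_day[responded].append(responded - asked)

                                      daily_avg = {day: sum(gaps) / len(gaps)
                                                   for day, gaps in by_response_day.items()}
                                      print(daily_avg)  # {3: 1.5, 5: 2.5}
                                      ```

                                      The same grouping, applied to the day a question got its first Helpful or Correct answer, yields the other two latency lines.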


                                      [Screenshot: Question trend chart with average response times]

                                      I should mention that these average response times are in days (rounded up) but if you export the CSV you get the averages in hours.


                                      A couple of notes on the charts: you can remove any of the lines that you don't want, so you could focus only on Questions and Questions with Responses if you don't care about Helpful or Correct answers.  Also, you can zoom in on date ranges either visually within a chart or by selecting a different date range (last 7 days, 30 days, 90 days, or 1 year).  And, most interestingly, you can add a filter that limits the charts to only questions asked by people matching the filter.

                                        • Re: Metrics and Analytics

                                          I had a couple of suggestions that I think would be beneficial.  Hopefully they can make it on the road map.


                                          1 - Add questions with no responses.  Everything else is there except for that, and it is something we are likely going to report on.  Sure, we can do some math to figure it out, but since everything else is already there it would be nice if that were included too.


                                          2 - Add the ability to see questions answered by a particular group.  We have an employee group, as I am sure many external communities do, and one thing we are going to try to report up the chain is how many questions got answered by non-employees, to show the value of users helping other users without taxing our resources.


                                          I am stoked about this.  It looks great so far.  Thanks for the work on it Karl.

                                          • Re: Metrics and Analytics

                                            Why isn't the average response time tracked cumulatively over time, as the other metrics are? This "on this day" measure isn't really that helpful: replies and answers happen seldom enough on a given day that the samples are too small to be meaningful, and using averages means the daily scores can be radically skewed by outliers.


                                            Any plans to change this schema going forward?





                                              • Re: Metrics and Analytics

                                                I have also experienced the problem of outliers, so I appreciate your concern.  What cumulative measure do you think would be useful?

                                                Regarding our plans: we don't have plans to rework this chart right now but getting precise about what changes would add a lot of value would help us figure out when it makes sense to do that.



                                                  • Re: Metrics and Analytics

                                                    Most of the other reports seem to take an "as of this day" approach (e.g. as of March 20th, there were 600 questions, 300 of which had replies and 200 of which had correct answers).


                                                    So, it would be great to see the same thing applied to avg response time (as of March 20th, the avg response time for all of the questions in the system on that date was X days to a reply, X days to a helpful answer, and X days to a correct answer).


                                                    That number could then be compared against the number of questions in the system that still have NO reply or correct answer, as a low avg response time is fairly misleading if you have a large percentage of questions that still have no first reply at all.
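
                                                    The "as of day D" variant suggested here could be computed along these lines (hypothetical records; `None` marks a question with no reply yet):

                                                    ```python
                                                    # Hypothetical records: (asked_day, first_response_day or None).
                                                    questions = [(1, 3), (2, 3), (1, 5), (4, None), (6, None)]

                                                    def as_of(day, qs):
                                                        """Cumulative avg first-response time and open-question count as of `day`."""
                                                        answered = [(a, r) for (a, r) in qs
                                                                    if a <= day and r is not None and r <= day]
                                                        open_no_reply = sum(1 for (a, r) in qs
                                                                            if a <= day and (r is None or r > day))
                                                        avg = (sum(r - a for a, r in answered) / len(answered)
                                                               if answered else None)
                                                        return avg, open_no_reply

                                                    avg, unanswered = as_of(5, questions)
                                                    print(avg, unanswered)  # avg of 2, 1 and 4 days; one question still open
                                                    ```

                                                    Reporting the unanswered count alongside the average addresses the point above: a low average alone hides questions that never got a first reply.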

                                                      • Re: Metrics and Analytics

                                                        Hi Karl,


                                                        Wondering if there is any progress on getting a better CMR chart to understand time to first response. This chart seems very misleading. Is there a way to get the system to recalculate at a different level of granularity (e.g. what is the avg response time for a given month, not for each day in a given month)?


                                                        I also think that the outliers just make averaging problematic regardless. Any way we could get a scatter plot chart that would show where the response times cluster?





                                                          • Re: Metrics and Analytics

                                                            Hi Meg,


                                                            Thanks for the ping on this.  I agree with you that this report should be improved.  Doing so is most definitely on our roadmap.  But to be completely transparent, at the moment there are two new reports that I am pushing the team to get done that are taking priority.  I am hopeful that these will be useful and worth the wait on fixing up the question latency report and other important things.  Specifically, we are prioritizing a content leaderboard and a daily activity chart.  The first is similar to the places leaderboard, but instead of giving the most active places over a time interval it gives the most active content, both globally and in a place.  The daily activity chart will give an easy way to measure the ups and downs of activity (in a place or globally) on a daily basis.


                                                            I hope to catch up with you at JiveWorld, where I plan to demo the new charts and also dig deeper into your suggestions for the question latency chart and other things.





                                                            • Re: Metrics and Analytics

                                                              With the QlikView framework we built (which sits on top of Jive's data warehouse), we can probably address the questions from both Matt Nevill and Meg Gordon. With our framework, you can get to the most granular level of detail that you want, e.g. who looked at a particular question on a particular day, and what did they do with it?


                                                              The snippet below gives you a simple view of one of our dashboards... this is showing data from Jul-Sep 2012. You can then drill into ANYTHING on this screen and get to a more granular level of detail (a Document, an Activity, a Group, a Space, etc.). The data view changes immediately. Happy to chat further with anyone who is interested.


                                                              (Please note -- the data is scrambled for privacy reasons.)


                                                              [Screenshot: QlikView dashboard snapshot]


                                                              Jonathan Tzeng, thanks for looping me in!