17 Replies Latest reply on Aug 6, 2015 3:56 PM by Ted Hopton

    How do we investigate and act on Engagement Index results?

    Ted Hopton

      I've been an advocate for actionable analytics for a long time, and I have been pleased to see and hear Jive emphasizing the term when discussing their objectives for reporting enhancements. I'd like some help in understanding how to act on the new Engagement Index report. I have some concerns. For convenient reference, I have bolded my questions.

       

      Here is what I see for my company today, June 29 2015:

      engagement index june 2015 example.png

      WTF Happened in May?

      My first concern is how to explain the huge drop in engagement in May. If I publish this report and share it with my stakeholders, that's certainly the first thing they will ask. And I have no idea how to answer them. Help, Jive! You created this report, now teach us how to use it. How can I find out what changed in May?

       

      Is the Report Correct?

      Not to put too fine a point on it, but I've had problems with the integrity of the data in Jive's reports before, so the first thing I'd like is a way to validate that the report is accurate. I'd like to drill down on this report by business unit or location, to see if there's any noticeable anomaly, but there's no filtering option. (While I'm at it, I'd also like to be able to select the time range so those empty months don't show on the report, but for some reason only the end date can be modified, not the start date.)

       

      So, I looked for another report to help me understand this one and the best I could see was the Daily Activity report for the same time period:

      daily activity june 2015 example.png

      I'd have expected to see a similar drop-off in daily activity in May, but there is no such aberration. Views are by far the dominant activity, so if they haven't changed, then why has the Engagement Index changed so drastically? And then there's another oddity: when we upgraded to Jive Cloud we had a huge spike in activity as people checked out all the new features (they hadn't seen an upgrade in years...). So, why is the Engagement Index really low in November, too? (Maybe it only has data from November 8 forward, since we upgraded on November 7?)

       

      Here's what the Daily Activity looks like without Views, since it's starting to look like maybe they aren't part of the Engagement Index:

      daily activity june 2015 no views.png

      I sure don't see anything conclusive here that helps me really explain the Engagement Index results. Maybe May is a bit lower in this report, but nothing drastically lower the way the Engagement Index shows.

       

      So if I can't find any corresponding change in activity levels, how about on the user side? I looked at the User Adoption chart:

      user adoption example june 2015.png

      Since the active, participating and contributing users are measured over the previous 30 days, I included all of June that was available, just to be sure there wasn't some drastic change that took place in May. Nothing jumps out at me from this data to explain why May on the Engagement Index is so low.

       

      What Else Can I Look At?

      I guess I'm going to have to dig into the Jive DES to try to find some answers. Any suggestions on where to start with the Jive DES data? I guess the first step will be to see if I can calculate the Engagement Index myself from that data and see if it matches what this report displays. Assuming that I find the report is accurate, once I have built my own calculation, then I should be able to drill down into the data to try to see where the changes are taking place. Something big seems to have happened somewhere...
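
      The sanity check I have in mind could look something like the sketch below. The event names, weights, and per-user normalization here are placeholders I made up, not Jive's actual formula; the real DES event types and the published Engagement Index definition would have to replace them.

```python
from collections import Counter

# Hypothetical event-type weights -- the real Engagement Index formula is not
# documented here, so these names and numbers are stand-ins only.
WEIGHTS = {"VIEW": 1, "LIKE": 2, "COMMENT": 3, "CREATE": 5}

def engagement_index(events, active_users_by_month):
    """Sum weighted event counts per month, normalized by active users.

    events: iterable of (month, event_type) tuples exported from the DES.
    active_users_by_month: dict mapping month -> active user count.
    """
    totals = Counter()
    for month, event_type in events:
        totals[month] += WEIGHTS.get(event_type, 0)
    return {m: totals[m] / users for m, users in active_users_by_month.items()}

# Toy data: the same user base in both months, but far fewer comments in May.
events = (
    [("2015-04", "VIEW")] * 100 + [("2015-04", "COMMENT")] * 10
    + [("2015-05", "VIEW")] * 100 + [("2015-05", "COMMENT")] * 1
)
index = engagement_index(events, {"2015-04": 50, "2015-05": 50})
```

      Once a home-grown calculation like this matches the report, the same per-event data can be sliced by business unit or location to locate where the change happened.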

       

      Any suggestions? Hoping for help from analytics stars such as Udit Shah, Dirk McNealy or Claire Flanagan.

       

      Looking in the Rear-View Mirror vs. Looking at the Road Ahead Through the Windshield

      Oh, one more thing... it would be really useful to get an idea how June's numbers look, but that's not possible with this report. If I select a date past June 1 but before the end of June (because June has not ended), then the report shows results through May. Rear-view mirrors are useful for certain tasks, but it's also really helpful to get a look ahead at where the numbers are going so you can take corrective action as needed (back to that actionable bit again). Is there a way the report could let us see what's developing during the month?

        • Re: How do we investigate and act on Engagement Index results?
          cflanagan17

          Hi, thank you for posting. I see you @mentioned Udit, who might be able to help (and I'll see what I can do behind the scenes to rally answers). In the meantime, did you log a ticket in your support area?

          • Re: How do we investigate and act on Engagement Index results?

            Ted Hopton, would it be possible for you to open a ticket with support? I will also discuss this with the team internally, but these issues can be addressed more quickly with the help of support.

            • Re: How do we investigate and act on Engagement Index results?
              Dennis Pearce

              One feature I would like to see across the board in CMR is the ability to view data year-on-year.  Today I export the data so that I can create the charts I want in Excel.  Year-on-year is helpful because adoption is often a slow, gradual process in large companies and it's hard to see progress or deterioration when the data are presented in one continuous line.  So for example once I got the new engagement data I created this chart:

              engagement.png

              I know that in early 2013 we were just a few months into using Jive, so there was a smaller base of users, most of whom were busy setting things up, which is why the line is higher at that time. I can see that there is an understandable seasonal dip at the end of the year. I am also pleased to see we have been making slow, steady progress over the years. This is the kind of trend that is very hard to spot when viewing a continuous line, especially with only a year's worth of data at a time.
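
              For what it's worth, the export-and-regraph step is simple to reproduce outside Excel too. Here is a minimal sketch of the same year-on-year reshaping in Python with pandas; the column names and the synthetic values are assumptions standing in for the exported CMR data.

```python
import pandas as pd

# Assumed export format: one row per month with an "engagement" value.
# The range(30) values are synthetic stand-ins for real exported numbers.
df = pd.DataFrame({
    "month": pd.date_range("2013-01-01", periods=30, freq="MS"),
    "engagement": range(30),
})
df["year"] = df["month"].dt.year
df["month_num"] = df["month"].dt.month

# One column per year, one row per calendar month -- so each year can be
# drawn as its own line, exactly like the Excel chart described above.
yoy = df.pivot(index="month_num", columns="year", values="engagement")
# yoy.plot() would render one line per year for side-by-side comparison.
```

              The pivot is what makes seasonal dips and slow year-over-year progress visible, because the same calendar months line up vertically instead of stretching out along one continuous axis.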

               

              I am also glad to see that this engagement metric is a ratio based on the number of users, and I hope that's a trend for future metrics.  I have been re-graphing the active, participating, and contributing data as percentages since we started with Jive.  We have acquired 12 companies in the last few years while at the same time divesting ourselves of a division and offering an incentivized headcount reduction that stretched over the course of a year.  So our employee base has bounced all over the place and just looking at absolute numbers doesn't tell me much at all.

              1 person found this helpful
              • Re: How do we investigate and act on Engagement Index results?
                Ted Hopton

                Update on this issue, and a request for cooperation from other Jive Cloud customers

                 

                Jive Support has been investigating. Here is some insight from that case:

                The two primary contributors that I saw to your drop in Engagement Index were Document Downloads and User Modifications.  Downloads on Documents dropped from about 1 million per month for Jan through April down to around 40 thousand per month for May on.  Modifications for Users dropped from about 1.1 million per month for Jan through April down to less than 10 thousand per month for May onwards.

                My response:

                I still have trouble accepting that my community's users suddenly and consistently changed behavior so drastically. The only time I have ever seen such drastic changes in the level of reported activity (going back to 2008) is when Jive changed the way data was collected. IOW, I strongly suspect that something happened in May that affects the way Jive collects or classifies this data, particularly the two items you highlighted.

                 

                Do other customers see a similar change in Engagement Activity results?

                For Document Downloads, that's a 96% decrease. For Modifications for Users, it's a 99% decrease. From one month to the next. And then sustained at the new low levels ever since. Sure, I expect some seasonal slowdown in activity in the summer months, but May is not a peak vacation month, and never would I expect such massive shifts in user behavior.
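
                Those percentages follow directly from the approximate volumes Support reported; as a quick arithmetic check:

```python
# Month-over-month decline, using the approximate volumes from the support case.
downloads_before, downloads_after = 1_000_000, 40_000
mods_before, mods_after = 1_100_000, 10_000

download_drop = (downloads_before - downloads_after) / downloads_before  # 0.96
mods_drop = (mods_before - mods_after) / mods_before  # roughly 0.99
```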

                 

                Let's be realistic, too: we have 5,000-6,000 active users. They never really downloaded 1 million documents per month. Rather, Jive classified some database activity as "Document Downloads" that is not what normal people call downloading a document. I know, for example, that every time an image housed in a Jive document gets called up for display on a page, it has been logged as a "document download" by Jive (at least it was in the past). We have Social Edge Consulting's MOSAIC widget on our home page, and it alone racks up at least a half dozen "downloads" every time anyone loads our home page.

                 

                Udit Shah, has anything changed in the way Jive classifies activities? Dirk McNealy, do you have any insight on this?

                 

                Fellow Jive Cloud customers, would you please pull your Engagement Index report for all of 2015 and see if it looks like ours?

                  • Re: How do we investigate and act on Engagement Index results?

                    I agree that it is very unlikely that your users suddenly stopped doing 99% of their activities. Our intention for cloud analytics is that if the same activities happen, at least the same events should be produced, with additional data or event types only added from version to version. I can't think of a change off the top of my head that has gone out that should have caused this, but it is very possible that one has. Ideally, your Jive support contact can file an issue with the analytics dev team so they can look into your case in more detail and give you a more definitive answer.

                      • Re: How do we investigate and act on Engagement Index results?
                        cflanagan17

                        Dirk McNealy Just so you know, Ted has filed a support case. There is an internal discussion on this that I'll @mention you on for review!

                         

                        Ted Hopton hang tight. As a former customer I feel your pain and your question! And of course you know metrics is near and dear to my heart. I'm trying to rally some folks internally to dig in a little deeper!

                          • Re: How do we investigate and act on Engagement Index results?
                            Ted Hopton

                            Thanks, as always, Claire, for your support and attention to stuff like this. I know you care deeply and it's appreciated.

                             

                            Thanks for chiming in, too, Dirk. It makes me feel better hearing you think something must have changed, too. Now we just have to find out what it was, why, and what to do about it. Frankly, if the change results in more accurate reporting (e.g., counting only true document downloads as document downloads), then I can live with the change and explain it. I just can't explain to my stakeholders that I "suspect" there was a change -- I need a clear explanation from Jive for such drastic changes in the numbers. And if it's affecting all Jive Cloud customers, they'll need it, too, once they start looking closely at the Engagement Index report.

                        • Re: How do we investigate and act on Engagement Index results?
                          Stephanie Standring

                          We also have a weird up and down in our user engagement index that doesn't make sense to me. In October 2014 we see a big spike, the biggest of all the months to date, but our User Adoption and participation are the highest they have been. So I don't understand why there was such a huge spike in October.

                           

                          Is this graph relative to the data/numbers for that time? Is that why it looks larger than the current month?

                        • Re: How do we investigate and act on Engagement Index results?
                          Ted Hopton

                          Here's an update on this issue from Daniel Harada in Jive Support:

                          Another quick update - It looks like the Modify User activity was the dominating factor for your drop in Engagement Index.  Our Analytics team calculated your engagement index for the past 6 months with the Modify User activity removed entirely, May and June look comparable to other months in that view.


                          We are still working to find what may have caused this shift in the Modify User activity.  In our internal Jive instance we see a similar shift in this particular activity, but starting in June instead of May.  I will let you know once I have more information on what could have caused this shift.

                          And then, this update which sounds promising to me:

                          After further discussion, it has been decided that the User Modify event is not an event that we wish to track for the Engagement Index, as it is unlikely to be related to actual user engagement.  A bug has been filed to remove this item from the Engagement Index calculation, JIVE-60554.  I don't currently have an ETA for when this bug will be addressed.  As I mentioned above, this one activity seems to have been the driving factor for your shift in Engagement Index.
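
                          The recalculation Support describes amounts to filtering one event type out before aggregating. A minimal sketch of that idea, assuming a "MODIFY_USER" event name (the real DES event name may differ):

```python
from collections import Counter

# Toy event stream; "MODIFY_USER" is an assumed name for the event
# that Support says will be dropped from the Engagement Index.
events = [
    ("2015-04", "VIEW"), ("2015-04", "MODIFY_USER"),
    ("2015-05", "VIEW"), ("2015-05", "VIEW"),
]

def monthly_counts(events, excluded=frozenset()):
    """Count events per month, skipping any excluded event types."""
    counts = Counter()
    for month, event_type in events:
        if event_type not in excluded:
            counts[month] += 1
    return counts

with_all = monthly_counts(events)
without_modify = monthly_counts(events, excluded={"MODIFY_USER"})
```

                          Comparing the two series month by month is exactly the kind of drill-down that would let a customer confirm on their own data that one event type was driving the shift.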

                          Sounds like a satisfactory resolution to me and I look forward to seeing this bug resolved so the Engagement Index will become meaningful (I hope).

                           

                          Nonetheless, I still think there is a need to provide more ways for us to dig into, and therefore understand and explain, the data underlying the Engagement Index report. That need remains outstanding.

                          1 person found this helpful