A few months back when I talked about the redesign of this site, I mentioned some goals I had and how they might be measured through analytics. I meant to share some of those statistics sooner, but better late than never.
With some goals it’s a subjective call whether or not the design succeeds. Others can be measured. In my post about setting goals for the redesign I mentioned 5 specific metrics that might offer insight into the effect of the new design.
- subscriber count
- pages per visit
- time spent on page/site
- bounce rate
- unique visitors or visits
I want to focus this post on the middle 3 items above, but first I have a couple of general thoughts.
Subscribers and traffic are each up about 20% since I launched the redesign, but I can’t say it’s mostly due to the new design. Both are always increasing and probably more because of marketing than anything else. I would imagine the new design contributes, but I don’t have a good way to determine what effect it had.
Something I’ve also noticed is that traffic from mobile devices has grown remarkably. Comparing the month prior to the redesign with the month after, mobile traffic increased over 300%. How much of that relates to the responsive design and how much is simply the ever increasing use of mobile devices, I don’t know. That the increase happened over such a short time frame leads me to think the responsiveness of the site contributed significantly.
A Word About the Stats
I’ve looked over analytics data for the site in a number of ways. Here I want to share stats from a couple of different time frames.
The first time frame is centered around the day I launched the redesign and it compares July 30th through Aug 29th (after) with June 25th through July 25th (before). The dates are adjusted to make a Monday to Monday and Tuesday to Tuesday comparison.
Second is a year over year comparison between November 24th to December 24th of this year and November 26th through December 26th of last year. The two day difference is again so Monday would align with Monday, Tuesday with Tuesday, etc., which I think makes for a more accurate comparison.
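If you want to sanity-check that kind of weekday alignment for your own date ranges, a short script can do it. This is purely illustrative, using the start dates quoted above (years taken from the comparison tables):

```python
from datetime import date

# Each comparison pairs a recent start date with an earlier one,
# shifted so the windows begin on the same day of the week.
comparisons = [
    (date(2012, 7, 30), date(2012, 6, 25)),    # post-redesign vs pre-redesign
    (date(2012, 11, 24), date(2011, 11, 26)),  # this year vs last year
]

for recent, earlier in comparisons:
    # weekday(): Monday == 0 ... Sunday == 6
    print(recent, recent.strftime("%A"), "vs", earlier, earlier.strftime("%A"))
    assert recent.weekday() == earlier.weekday(), "windows are misaligned"
```

Both pairs pass: the redesign comparison runs Monday to Monday, and the year-over-year comparison happens to run Saturday to Saturday.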
Before and After Redesign
Even though the numbers below are missing a few days prior to and on the day of the launch they offer some insight into the immediate impact of the redesign.
However, there was a glitch: Analytics was recording thousands of visits from me as 0.00s visits. I never quite understood why, but after a few weeks I figured out how to stop counting those “visits.” The glitch only affected direct traffic numbers, which you’ll see aren’t aligned with the rest of the numbers.
Jul 30, 2012 – Aug 29, 2012 vs Jun 25, 2012 – Jul 25, 2012 (Pages/Visit, Bounce Rate, Visit Duration)
If you look at the change in overall traffic numbers, there’s improvement in all three metrics. Visit duration and pages/visit are up, while bounce rate is down. The improvement is minimal in pages/visit and bounce rate, but it’s there. The average visit duration improved significantly.
The numbers get better I think when we dig a little deeper.
Again, the numbers for direct traffic can’t be trusted, since they were heavily skewed by the reporting bug. Until I fixed things, Analytics was showing about 65% of all direct traffic as these 0.00s, 100% bounce rate visits from me.
I wasn’t expecting large changes in search traffic, given that it’s normal for a lot of search traffic to bounce right away, but you can still see all three metrics improve, with visit duration again improving significantly.
The referral numbers meant more to me, because they come from other sites recommending content here and represent visitors who are more likely to stick around and subscribe. Here the numbers beat the overall across the board, and even the pages/visit and bounce rate improvements become more significant.
Year Over Year
For a longer-range view I compared the most recent month this year with a similar time frame from a year ago. These ranges should eliminate any “it’s new” effect, and they aren’t affected by the direct traffic issue above. On the other hand, because more time has elapsed, it’s possible the contribution from things other than the design is greater.
I’m happy to report that these longer range numbers are even better than the more immediate numbers above.
Nov 24, 2012 – Dec 24, 2012 vs Nov 26, 2011 – Dec 26, 2011 (Pages/Visit, Bounce Rate, Visit Duration)
Once again all three metrics show improvement overall, with visit duration having the most significant change. Across the board the overall numbers are better than the immediate numbers above, which is even more significant considering the change is being compared against better starting numbers.
For example, the avg. visit duration from a year ago was 1:07, while it’s 1:41 now. Just prior to the redesign the duration was 1:03, and just after it was 1:28. Had I compared numbers from this month to just prior to the redesign, visit duration would show a 91% increase as opposed to the 51% increase it shows above.
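For clarity, these percentages are just relative change computed on the durations in seconds. A quick sketch, using the 1:07 and 1:41 figures quoted above:

```python
def pct_change(before_secs, after_secs):
    """Relative change, as a percentage of the starting value."""
    return (after_secs - before_secs) / before_secs * 100

# avg. visit duration a year ago vs now, converted to seconds
year_ago = 1 * 60 + 7    # 1:07 -> 67s
now = 1 * 60 + 41        # 1:41 -> 101s

print(round(pct_change(year_ago, now)))  # -> 51, matching the 51% above
```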
Again the numbers look better when digging deeper into each traffic source. Search traffic numbers drag the overall down some, which I expected and think is what we’d see in the month to month comparison had the direct traffic issue not been present.
Without the weird 0.00s visits from me, the numbers from direct traffic are consistent with the overall numbers.
Referral traffic showed the largest increases across the board. Bounce rate dropped nearly 15%, while pages/visit increased by a similar amount. Most impressive to me is that referral visitors are spending close to twice as much time on the site; the increase has actually been above 100% for some of the date ranges I’ve checked.
Overall it’s hard not to be happy with the changes I’m seeing through Analytics. I would have liked bounce rate to drop more and pages/visit to rise more, but both did improve. I think they’re mostly dragged down by the search data, which typically shows a higher bounce rate and lower pages/visit for a site like this one.
I’ll try to remember to look at similar numbers comparing the month prior to launching the new design with the same time frame a year later. I think that might also be interesting. It might also be revealing to compare mobile traffic data further; I didn’t here because last year’s numbers are generally too low to determine much.
The next major challenge is optimizing the site for speed and trying to determine what effect that has on this same set of numbers. I would expect faster download times to increase time spent on the site and pages/visit while decreasing bounces. What will be interesting is comparing those increases to these.
Nothing in this post should be taken as proof that a new design alone leads to these improvements. Any number of things could have contributed to what I’m seeing above, though it’s hard to think the redesign isn’t the major contributing factor, especially in the immediate month to month comparison.
To me the above all suggests that design does matter and improving your design can improve how much visitors engage with your site and with you.
If you liked this post, consider buying my book Design Fundamentals