
From web analytics to digital marketing optimization: Increasing the commercial value of digital analytics

  • Dave Chaffey
  • Mark Patron

Abstract

The use of web analytics to improve online marketing dates back to the 1990s when the first web analytics systems were developed. Yet, recent research suggests that many companies are failing to utilize core web analytics best practices and are therefore not getting the potential return from web analytics that they could. This paper reviews the opportunities for companies to better apply web analytics to improve digital marketing performance. An approach is defined to create a strategy to improve the value contributed by web analytics. The paper describes techniques that can be used to set up a digital marketing optimization programme, including a review of how people, process, measures and tools can be combined.

Keywords

digital marketing, digital analytics, web analytics, conversion rate optimization, user experience 

Introduction

The measurability of digital media has been heralded as one of its greatest benefits compared with other media since the mid-1990s when internet marketing, as it was known then, first started to be deployed. Many marketers realized that the capability to measure interactions of website visitors through log files provided previously unknown levels of insight into the effectiveness of marketing communications. In 1994, the first commercial web analytics vendor, I/PRO Corp, was launched, and since then Clifton 1 has catalogued the many web analytics tools that have been developed, including some that are still widely used today, such as WebTrends (1995), Omniture (2002) and Google Analytics (2005).

Levels of adoption of analysis techniques in web analytics

Despite web analytics services being well established, web analytics technology is still not used as widely or as effectively to improve marketing as might be expected. Adoption of the tools by companies is high, but usage of them remains surprisingly low. A survey of 700 client-side and agency digital marketers carried out in July and August 2011 by Econsultancy-RedEye 2 showed the usage of analysis methods across the main areas of web analytics applied to conversion rate optimization (CRO) (Figure 1).
Figure 1

Methods used by companies to improve conversion rates. Source: Econsultancy-RedEye 2. With permission of Econsultancy and RedEye

It is clear from Figure 1 that the majority of companies surveyed are not using fundamental web analytics approaches such as customer journey analysis and segmentation. This is the case despite the majority of users of these techniques surveyed in the report rating them as highly valuable or quite valuable.

The Econsultancy-RedEye 2 report also reviewed how companies using web analytics rated their practices (Figure 2). The majority of companies surveyed felt they needed to improve their analytics activities across the practices surveyed, such as identifying key performance indicators (KPIs), funnel analysis, mining internal search data, and integrating user testing and analytics.
Figure 2

Usage of alternative digital analytics practices. Source: Econsultancy-RedEye 2. With permission of Econsultancy and RedEye

Barriers to effective use of web analytics


We have seen that many core web analytics techniques are not used as widely as might be expected given the level of adoption of the tools. What then are the barriers that may be preventing this? The Econsultancy-RedEye 2 report gives some evidence of this through evaluating the barriers to CRO, one of the main applications of web analytics.

Figure 3 shows that the two most significant barriers are the lack of resources and budgets. This suggests that CRO and web analytics may be considered a lower business priority than some other marketing activities: according to Omniture, for every US$80 spent on driving traffic to websites, companies spend only $1 converting that traffic. The chart also suggests that there may be issues related to how web analytics is managed, since company culture, conflict of interest between departments and a siloed organization are frequently mentioned as contributors. Just under a third (28 per cent) of companies surveyed by Econsultancy and RedEye say that the methods used to improve conversion rates are looked after by a number of different departments, and more than a third (37 per cent) of these organizations report that findings are not combined for an overall analysis. It seems that web analytics is not being integrated with other means of improving conversion. If web analytics is to move towards full digital optimization, marketers will need to overcome these issues of ownership and integration.
Figure 3

Barriers preventing organizations from improving conversion rates. Source: Econsultancy-RedEye 2. With permission of Econsultancy and RedEye

The research in Figure 3 was first undertaken in 2009. Since then, ‘Poor technology’ has moved from third to seventh place (down from 25 per cent to 18 per cent), and ‘Poor integration between systems’ has moved from fourth to eighth place (down from 24 per cent to 17 per cent). This indicates that technology is becoming less of a barrier to optimization.

While technology has declined in importance, people and process issues have increased. When the Econsultancy and RedEye survey was analysed, the top four variables most strongly correlated with improved website conversion were in order of importance:
  1. Perceived control over conversion rates.
  2. Having a structured approach to CRO.
  3. Having someone directly responsible for CRO.
  4. Incentivizing staff based on conversion rates.

Companies with a structured approach to conversion were also twice as likely to have seen a large increase in sales over the previous 12 months.

The Web Analytics Association Outlook Survey 3 also gives insights into the challenges of deploying web analytics. Its members rated the top issues as:
  • Acting on the data to improve site performance (69.2 per cent)
  • Business decisions driven by analytics (63.5 per cent)
  • Best practices implementation (48.1 per cent)
  • Executive management awareness and support (41 per cent)
  • KPI development (40.6 per cent)
  • Developing process/implementing process (40.1 per cent)
  • Integration of current and new solutions (36.7 per cent)

The frustrations are expressed in a more human way by these responses from interviewees who are managing analytics:

There's just me as web analyst and 60 users of Google Analytics. They don’t dedicate time to using it. They’re too busy running campaigns …

Web analyst, financial services

It's very frustrating. I’ve no time to go into any great depth. I don’t know where to go to find information to show what's working and what's not

Brand manager, Pharma company

I use it primarily for monthly or weekly reporting. Due to the amount of data and lack of time I don’t have time to do more

B2B E-commerce manager

It may be that not enough marketers have the skill sets to analyse the masses of online data available. Many simply use Google Analytics to track visitor numbers without getting underneath the data to understand why the numbers have gone up or down; knowing why makes it possible to do something about it. Success depends on focusing on reducing the inefficiencies that already exist in digital marketing, rather than running off reports when requests for information arise. The people who best prove the value of analytics are those who can converse with senior managers and understand their problems, and then provide solutions through analytics; without this, the senior managers would never have thought to ask the web analyst to help them. Avinash Kaushik's 90:10 rule, that you should spend $90 on people (including education and training) for every $10 spent on analytics tools, is as relevant today as when he first stated it in 2006.

Creating a strategy to improve the value contributed by web analytics

We believe that developing a digital marketing optimization strategy that increases the contribution of web analytics to an organization requires careful consideration of how commercial value is generated through web analytics now and in the future. A suitable starting point is to review how the potential value from applying web analytics contributes to commercial performance, and then compare this with current capabilities and value generated. These can then be reviewed against a matrix of current and future value, as shown in Figure 4.
Figure 4

Matrix for evaluating web analytics activities against potential future value to the organization. Source: SmartInsights.com (with permission)

Setting the scope and purpose of web analytics

To scope the contribution of web analytics, it is also worthwhile to clarify the meaning and scope of web analytics within a company and show how it contributes to commercial success. The well-established definition of web analytics from the Web Analytics Association can be used as a starting point for this:

Web Analytics is the measurement, collection, analysis and reporting of Internet data for the purposes of understanding and optimizing Web usage.


Because of its source and age, we would suggest some improvements to this definition. The emphasis on understanding and optimizing the effectiveness of marketing activities is sound, but there is arguably insufficient linkage to commercial value or to achieving business goals. The reference to ‘web usage’ also tends to silo web analytics activities, such that they do not relate to all online marketing activities; for instance, the definition does not clearly cover social media marketing or the integration of digital marketing with all other marketing activities. To increase adoption of web analytics in organizations, it may need to be repositioned as ‘digital marketing optimization’, giving it a wider range of influence integrated across all online marketing activities while using the core approaches of reporting, analysis, testing and improvement. Note that in 2012 the Web Analytics Association was renamed the Digital Analytics Association, giving a more relevant scope for the future application of web analytics.

To scope web analytics activities, the results of the Econsultancy-RedEye 2 research can be helpful together with the Web Analytics Association Outlook Report. 3 In this, 570 respondents were asked: What is the purpose of web analytics as a function in your organization? Multiple responses were possible; the following are the main activities identified in order:
  • optimizing website functionality and conversion (79.7 per cent);

  • analysis of past performance (73.7 per cent);

  • optimizing performance of and conversions from marketing campaigns (67.3 per cent);

  • determining the best creative executions through A/B and multivariate testing (49.8 per cent);

  • baseline information for site redesign (48.6 per cent);

  • predictive metrics for developing future marketing campaigns (41 per cent);

  • budgeting and planning for upcoming business objectives (32.7 per cent);

  • other (5.6 per cent).

When interpreting these responses it should be borne in mind that respondents are members of the Web Analytics Association, with over a third describing their role as ‘web analyst’. The perspectives are therefore often those of a specialized function that typically exists only in larger organizations. The analyst role often focuses on reporting and optimization of the company's website presences, and analysts may not have a wider view of the digital communications activities that increase reach and generate demand for an organization. This is evident in the responses, where website functionality and conversion is rated as the most important activity. Although optimizing the performance of campaigns is in third place, this takes the perspective of driving visitors to a site rather than engaging audiences and monitoring reputation through off-site social media and publisher presences.
We suggest that to give marketing optimization activities an increased role in improving marketing performance, the scope of web analytics should be broadened beyond its traditional focus on website optimization to other forms of digital media. A current model for considering this is the increasingly adopted classification of earned, owned and bought media (Figure 5). This compilation by Chaffey (2011) shows that the traditional role of web analytics falls within owned media and the measurement of bought media. Organizations are starting to extend their focus to the increased role of social media: for example, Dell and Gatorade announced in 2011 the creation of ‘social media command centres’ to harness and act on customer insight available through social media as part of reputation management and outreach.
Figure 5

Management of different types of media required in digital marketing. Source: SmartInsights.com (with permission)

Benchmarking of web analytics capabilities

We have noted the value of reviewing current web analytics capabilities to benchmark a company against best practice and to provide a baseline for future improvements. Capability maturity models are valuable for this purpose: benchmarking should consider the range of factors that, when combined, lead to more effective web analytics processes. Figure 6 shows how the correct selection of metrics, tools, people and process is necessary for optimization.
Figure 6

Key capabilities required for analytics-driven performance improvements. Source: SmartInsights.com (with permission)

Stephane Hamel has been active in suggesting and applying frameworks for benchmarking the application of web analytics. 4 He suggests that capability can be reviewed in these six areas:
  1. Management, Governance and Adoption: This includes defining clear responsibilities for web analytics.
  2. Objectives definition: Are the goals for web analytics clearly defined?
  3. Scoping: Determining whether the focus is on conversion optimization or digital marketing more broadly, as discussed in the previous section of this paper.
  4. Analytics team and expertise: Review of current capabilities within the team and how web analytics expertise occurs more broadly within the organization.
  5. Continuous improvement process and analysis methodology: The approach to improving results on a continuum from the role of web analytics as a reporting function to a more proactive, responsive approach to driving performance.
  6. Tools, technology and data integration: Providing the technical infrastructure formed of software and the integration of data from different sources to provide the reports and visualizations needed.

Another perspective on benchmarking is provided by the earlier Econsultancy-RedEye report. 5 This shows four different levels of capability based on the techniques used for analysis and by implication the availability of resources inside or outside the company to deliver this analysis.

Figure 7 is a simple capability model similar in approach to the Carnegie Mellon Capability Maturity Model (http://www.sei.cmu.edu) for optimizing information systems, which can be usefully applied to other improvement processes. For reference, the CMM maturity levels are Initial, Managed, Defined, Quantitatively Managed and Optimizing.
Figure 7

An assessment of the maturity of web analytics and optimization capability. Source: Econsultancy-RedEye 5. With permission of Econsultancy and RedEye

Improvement process

Hamel 4 notes the need for a continuous improvement process and analysis methodology for making improvements. As a starting point for developing an optimization process that works in an organization, other established approaches to reviewing and improving business performance can be used.

For example, the overall approach of Six Sigma for quantitative business process improvement has also been suggested as suitable for application to web analytics. 6 The DMAIC (Define-Measure-Analyse-Improve-Control) improvement framework can be applied here. Truscott 7 usefully elaborates this into eight stages for implementing a Six Sigma project:
  1. Identify the project.
  2. Define the project.
  3. Measure current process performance.
  4. Analyse current process.
  5. Develop the improvements, pilot and verify.
  6. Implement the changes, achieve breakthroughs in performance.
  7. Control at new level; institutionalize to hold the gains.
  8. Communicate new knowledge gained; transfer solution to similar areas.
Decker 8 has described how he applied a Six Sigma-DMAIC approach when managing the Dell.com website. Econsultancy-RedEye 2 has recommended a simplified version of the DMAIC approach applied to using web analytics to improve conversion rates (Figure 8).
Figure 8

Recommended process for deriving improvements from digital analytics. Source: Econsultancy-RedEye 2. With permission of Econsultancy and RedEye

The RedEye approach emphasizes the importance of setting clear business goals and KPIs as the starting point of the improvement process and creating a testing plan to prioritize tests. We believe it is the lack of a structured testing plan that is one of the main reasons web analytics is not applied in a more consistent way.

Selecting the appropriate metrics and measurement frameworks

The improvement processes we have described require early definition of the measures used to review and optimize online marketing performance. Because so many potential measures are available for online marketing, it is essential to group and categorize them so that the right measures are used to drive business performance. It is therefore useful to distinguish KPIs from performance metrics. KPIs are an important category of measure because they show the overall performance of a process and its sub-processes. Performance metrics, or performance drivers, are typically more granular measures used to evaluate and improve the efficiency and effectiveness of specific marketing activities.
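To make the distinction concrete, here is a minimal sketch (all figures hypothetical): granular funnel step rates act as performance drivers, while the overall conversion rate to sale is the KPI they roll up into.

```python
# Sketch: rolling granular performance metrics up into a KPI.
# All figures are hypothetical, for illustration only.

def conversion_rate(successes: int, attempts: int) -> float:
    """A granular performance metric: proportion of attempts that succeed."""
    return successes / attempts if attempts else 0.0

# Hypothetical funnel measurements for one month.
visits = 50_000
product_views = 20_000
basket_adds = 4_000
sales = 1_000

# Performance drivers: micro-conversion rates at each funnel step.
view_rate = conversion_rate(product_views, visits)         # step 1 rate
basket_rate = conversion_rate(basket_adds, product_views)  # step 2 rate
checkout_rate = conversion_rate(sales, basket_adds)        # step 3 rate

# KPI: overall conversion rate to sale, the product of the step rates.
overall_conversion = conversion_rate(sales, visits)
print(f"Conversion rate to sale: {overall_conversion:.1%}")
```

The step rates tell a conversion team where the funnel leaks; the single KPI is what a senior manager reviews.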

Developing a relevant measurement framework

Grouping measures within a measurement or KPI framework is essential in making the analysis of measures collected through web analytics relevant to different types of people within a company. Senior business managers and marketing directors will need commercial measures, while others working on customer acquisition, conversion or retention will require different measures.
Measurement frameworks are needed to simplify and visualize KPIs. There has been much debate about companies needing to develop a web analytics culture to get the most out of their investment. A significant aspect of this, as noted earlier, is working on the problems that already exist rather than simply running off reports when requests come in, and employing analysts who can converse with senior managers, understand their problems and then provide solutions through analytics.

Measurement frameworks define and group different types of measures needed to review and improve performance. Measures are grouped so that the most important measures that drive performance are defined. Different measures can be reviewed by different team members for areas related to their responsibility, and measures relating to the management of different marketing activities are grouped together. The measurement framework can be used to create physical reports and on-screen dashboards that are structured in a similar way and provide ‘drilldown’ to more detailed analysis to inform decisions.

The challenge of reviewing and acting on the range of measures related to digital marketing optimization is suggested by these different types of measures:
  • Commercial measures: Revenue, different cost types, profit and margin.

  • Contact volume and reach measures: Number of prospects, customers, site visitors, fans, followers and subscribers.

  • Quality of interaction measures: Bounce and conversion rates, engagement measures and hurdle rates.

  • Media cost measures: Cost per click, cost per thousand, cost per acquisition.

  • Customer value measures: Average order value, revenue per visit or contact, and longer-term assessments of lifetime value.

  • Customer sentiment measures: Customer satisfaction, Net Promoter Score, sentiment from social listening.

  • Long-term relationship measures: Lifetime value, loyalty measures and hurdle rates.

  • Multi-channel measures: Channel influences, for example sales generated off-line from the online presence and online from the off-line presence.

  • Media type measures: Breaking down performance by media type of paid, owned and earned as explained in the first section.

  • Marketplace measures: Data collected showing relative performance in the online marketplace against competitors.

  • Brand measures: Awareness, Familiarity, Favourability, Purchase intent.
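As an illustration of how such a grouping might be operationalized, the sketch below maps measure types from the list above to the audiences that review them; the role names and the exact mapping are our assumptions, not a prescribed framework.

```python
# Sketch: a measurement framework grouping measures by reviewing audience.
# The role-to-measure mapping is illustrative only.

FRAMEWORK = {
    "senior management": {
        "commercial": ["revenue", "profit", "margin"],
        "brand": ["awareness", "favourability", "purchase intent"],
    },
    "acquisition team": {
        "reach": ["visitors", "fans", "followers", "subscribers"],
        "media cost": ["cost per click", "cost per acquisition"],
    },
    "conversion team": {
        "interaction quality": ["bounce rate", "conversion rate"],
        "customer value": ["average order value", "revenue per visit"],
    },
    "retention team": {
        "relationship": ["lifetime value", "loyalty"],
        "sentiment": ["satisfaction", "Net Promoter Score"],
    },
}

def measures_for(role: str) -> list[str]:
    """Drill down: flatten the measure groups reviewed by one role."""
    groups = FRAMEWORK.get(role, {})
    return [m for group in groups.values() for m in group]

print(measures_for("conversion team"))
```

A dashboard built on such a structure can show each audience only its own group, with drilldown from commercial KPIs to granular drivers.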

Selection of the appropriate usability and insight tools can also improve performance. Without auditing the types of business, consumer and market insight collected, there is a danger that key types of insight will be missed. For example, a company that is overly reliant on web analytics from a tracking system may miss more qualitative data about customer satisfaction or sentiment, such as Net Promoter Score. It will know ‘what’ web visitors are doing through their clickstreams, but will not understand their motivations, the ‘why’ behind the way they act.

For example, the Econsultancy-RedEye 2 survey found that for companies whose revenues were over £10 million, those who integrated user testing and analytics well were twice as likely to have seen a large increase in sales as those that did not. Similarly, companies whose conversion rates had improved over the previous 12 months were using on average 26 per cent more methods to improve conversion than those companies whose conversion rates had not improved. We suggest that companies can review their use of relevant tools to gain alternative types of insight by comparing them as shown in Figure 9.
Figure 9

Range of methods used for improving performance. Source: Patron (2011). With permission of RedEye

Taking this theme of combining complementary methods further, RedEye has developed a patent-pending process to integrate web analytics data with other data sources for better digital marketing optimization and prioritization of tests. 9 Using analytics, usability and other data sources, information is gleaned on three independent elements: web analytics analysis of website content identifies which areas of the site have the greatest influence on the end sale; segment analysis identifies which user groups are most valuable, for example repeat or registered visitors; and customer core journey analysis identifies the most valuable user journeys critical to business interests and goals.
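Purely as an illustration of combining scores from three such elements (this is not RedEye's patented method; page names, scores and weights are all invented), a weighted sum could rank candidate pages for testing:

```python
# Illustrative only: ranking test candidates by combining three analysis
# scores (each 0-1). Pages, scores and weights are invented examples.

PAGES = {
    # page: (content influence on sale, value of segments reaching it,
    #        presence on core customer journeys)
    "home":    (0.6, 0.5, 0.9),
    "product": (0.8, 0.7, 0.8),
    "basket":  (0.9, 0.8, 0.7),
    "blog":    (0.2, 0.3, 0.1),
}

WEIGHTS = (0.4, 0.3, 0.3)  # assumed relative importance of the elements

def priority(scores: tuple[float, float, float]) -> float:
    """Weighted sum of the three element scores."""
    return sum(w * s for w, s in zip(WEIGHTS, scores))

ranked = sorted(PAGES, key=lambda p: priority(PAGES[p]), reverse=True)
print(ranked)  # pages in descending test priority
```

The point of such prioritization is that testing capacity is limited, so tests are run first where influence, segment value and journey importance coincide.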

RACE: A model framework for optimizing online marketing performance

In addition to defining the right people, tools and process, a suitable method of reviewing relevant KPIs that improve digital marketing is also essential.

An appropriate KPI framework should clearly distinguish between evaluation of customer acquisition, conversion and retention for reporting and analysis of the effectiveness of marketing activities for those responsible in each area. 10 It should also define different classes of measures from operational to strategic importance. Dave Chaffey developed the RACE framework to meet these needs. 11 RACE is an evolution of the REAN (Reach, Engage, Activate, Nurture) framework for web analysts originally developed by Xavier Blanc and popularized by Jackson. 12 Both authors separated out conversion activities into generation of leads or interactions with an online presence and then conversion to sale. The framework is thus appropriate for marketing activities where lead generation and conversion to sale can be protracted, as is the case with insurance or many business-to-business services, for example.

RACE (Table 1) consists of four steps or online marketing activities designed to help brands engage their customers throughout the customer lifecycle.
  • Step 1 — Reach: Reach means building awareness of a brand, its products and services on other websites and in off-line media in order to build traffic by driving visits to different web presences such as a company website, microsites or social media sites. It is important that the assessment of the contribution of media referrers to leads and sales uses a suitable attribution model reflecting the customer touchpoints before sale, 13 rather than the ‘last click wins’ model, which ascribes value only to the most recent visit before a lead or sale.

  • Step 2 — Act: Act is about persuading site visitors or prospects to take the next step of interacting on their journey when they initially reach a site or social network presence. It may mean finding out more about a company or its products, searching to find a product or reading a blog post. It is about engaging the audience through relevant, compelling content and clear navigation pathways so that they do not hit the back button.

  • Step 3 — Convert: Conversion is where the visitor commits to forming a relationship that will generate commercial value for the business; that is, conversion to marketing goals or outcomes such as leads or sales, generated online or offline.

  • Step 4 — Engage: Building customer relationships over time to achieve retention goals through activities such as E-mail and social media marketing.
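The attribution point made under Reach can be illustrated with a minimal sketch contrasting ‘last click wins’ with a simple linear multi-touch model; the journey and sale value are invented:

```python
# Sketch: 'last click wins' vs linear multi-touch attribution for one
# converting customer journey. Channels and sale value are invented.

journey = ["display", "organic search", "e-mail", "paid search"]
sale_value = 120.0

# Last click wins: all value ascribed to the final touchpoint before sale.
last_click = {channel: 0.0 for channel in journey}
last_click[journey[-1]] = sale_value

# Linear attribution: value shared equally across all touchpoints.
share = sale_value / len(journey)
linear = {channel: share for channel in journey}

print(last_click)  # only the final channel is credited
print(linear)      # every touchpoint receives an equal share
```

Under last click, display and e-mail appear worthless despite contributing to the journey; a multi-touch model surfaces their influence, which changes budget decisions across Reach channels.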

Table 1

RACE analysis

Columns (metrics overall or broken down by channel): Reach audience | Encourage action | Convert to sale | Engage customers to retain and grow

Tracking metrics
  • Reach: Unique visitors; New visitors; Visits
  • Act: Online opportunity (lead) volume; Off-line opportunity (lead) volume
  • Convert: Online sales volume; Off-line sales volume; Transactions
  • Engage: E-mail list quality; E-mail response quality; Conversation volume

Performance drivers (diagnostics)
  • Reach: Share of audience compared with competitors; Share of search; Brand/direct visits
  • Act: Bounce rate and duration measures; Macro-conversion rate to opportunity and micro-conversion efficiency
  • Convert: Conversion rate to sale; E-mail conversion rate
  • Engage: Active customers percentage (site and E-mail active); Active social followers; Repeat conversion rate

Customer-centric KPIs
  • Reach: Cost per click and per sale; Conversation polarity (sentiment); Brand awareness
  • Act: Cost per opportunity; Customer satisfaction
  • Convert: Cost per sale; Customer satisfaction
  • Engage: Lifetime value; Customer advocacy index (eg Net Promoter Score); Customer loyalty index; Products per customer

Business value KPIs
  • Reach: Audience share (owned media); Share of voice (conversations)
  • Act: Goal value per visit; Online product requests (n, £, percentage of total)
  • Convert: Revenue per visit; Online-originated sales revenue and profit (n, £, percentage of total)
  • Engage: Retained sales growth and volume; Revenue per active customer

RACE covers all the main measures to consider across the customer lifecycle (in columns) and different levels of reporting depending on who is reviewing performance (in rows).
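To show how the reporting levels relate, the sketch below derives one Convert-column measure at each level of Table 1 from raw counts; all inputs are hypothetical:

```python
# Sketch: deriving Convert-column measures at each reporting level of the
# RACE table from raw counts. All figures are hypothetical.

visits = 40_000
online_sales = 800          # tracking metric: online sales volume
online_revenue = 64_000.0   # £, underlying commercial data
media_spend = 16_000.0      # £, cost of driving the visits

conversion_rate_to_sale = online_sales / visits  # performance driver
cost_per_sale = media_spend / online_sales       # customer-centric KPI
revenue_per_visit = online_revenue / visits      # business value KPI

print(f"Conversion rate to sale: {conversion_rate_to_sale:.1%}")
print(f"Cost per sale: £{cost_per_sale:.2f}")
print(f"Revenue per visit: £{revenue_per_visit:.2f}")
```

The same raw counts thus feed an analyst's diagnostic, a marketing manager's cost KPI and a director's value KPI, which is the layering the rows of Table 1 are designed to support.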

Some companies prefer to use alternative frameworks for management of processes that have a business rather than marketing focus. The balanced scorecard approach introduced by Kaplan and Norton 14 is the best known of these, with performance assessed through four areas: customer concerns, internal efficiency measures, financial measures, and learning and growth (innovation).

Regardless of the framework used, each measure should be evaluated for its suitability by testing its relevance. A comprehensive method of KPI assessment has been suggested by Neely 15 :

The ten measure design tests

  1. The truth test: Are we really measuring what we set out to measure?
  2. The focus test: Are we only measuring what we set out to measure?
  3. The relevancy test: Is it the right measure of the performance factor we want to track?
  4. The consistency test: Will the data always be collected in the same way, whoever measures them?
  5. The access test: Is it easy to locate and capture the data needed to make the measurement?
  6. The clarity test: Is any ambiguity possible in interpreting the results?
  7. The so-what test: Can and will the data be acted upon?
  8. The timeliness test: Can the data be accessed rapidly and frequently enough for action?
  9. The cost test: Is the measure worth the cost of measurement?
  10. The gaming test: Is the measure likely to encourage undesirable or inappropriate behaviours?

Conclusion

To increase the adoption of web analytics in organizations it needs to be repositioned as an improvement process such as ‘digital marketing optimization’ or the narrower ‘CRO’.

Technology and data integration challenges are becoming less of the major barrier they were in preventing companies from improving website conversion rates. The challenge is becoming one of people and processes.

The digital industry lacks enough experienced people. Much of this seems to stem from companies not investing enough in people while wasting too much on tools they do not use or media that do not work. Sadly, because the cost of acquiring a customer online is often much lower than off-line, there is little incentive to change this. As results for online marketing inevitably converge with off-line results, however, the incentive for companies to be more disciplined with their online marketing will drive the required change.

Companies should review their structure and their investment in web analytics and digital marketing optimization to make sure opportunities are not falling through the cracks.

References


  1. Clifton, B. (2008) ‘Web analytics: Web traffic data sources and vendor comparison’, Omega Digital Whitepaper, available at http://www.advanced-web-metrics.com/docs/web-data-sources.pdf, accessed 1 February 2012.
  2. Econsultancy-RedEye. (2011) ‘Conversion Rate Optimization Report’, published October 2011.
  3. Web Analytics Association. (2011) ‘Outlook Survey Report’, published February 2011.
  4. Hamel, S. (2009) ‘Online analytics maturity model (OAMM) paper’, available at http://immeria.net/oamm/paper.htm, accessed 1 February 2012.
  5. Econsultancy-RedEye. (2009) ‘Conversion Report’, published October 2009.
  6. Kaushik, A. (2007) Web Analytics: An Hour a Day, Wiley, Hoboken, NJ.
  7. Truscott, W. (2003) Six Sigma: Continual Improvement for Businesses, Butterworth-Heinemann, Oxford, UK.
  8. Decker, S. (2006) ‘Marketing Bullseye 2: Think Six Sigma’, blog post, 24 July, available at http://decker.typepad.com/welcome/2006/07/marketing_bulls_1.html, accessed 1 February 2012.
  9. Gibbins, C., Lee, G. and Patron, M. (2012) ‘RedEye conversion rate optimization dashboard’, RedEye Whitepaper, January.
  10. Chaffey, D. (2001) ‘Optimising e-marketing performance — A review of approaches and tools’, in Proceedings of IBM Workshop on Business Intelligence and E-marketing, Warwick, 6 December.
  11. Smart Insights. (2010) ‘Introducing RACE: A practical framework to improve your digital marketing’, blog post by Dave Chaffey, 15 July, available at http://www.smartinsights.com/blog/digital-marketing-strategy/race-a-practical-framework-to-improve-your-digital-marketing/, accessed 1 February 2012.
  12. Jackson, S. (2009) Cult of Analytics, Butterworth-Heinemann, Oxford, UK.
  13. Lee, G. (2010) ‘Death of “last click wins”: Media attribution and the expanding use of media data’, Journal of Direct, Data and Digital Marketing Practice, Vol. 12, No. 1, pp. 16–26.
  14. Kaplan, R.S. and Norton, D.P. (1993) ‘Putting the balanced scorecard to work’, Harvard Business Review, September–October, pp. 134–142.
  15. Neely, A., Adams, C. and Kennerley, M. (2002) The Performance Prism: The Scorecard for Measuring and Managing Business Success, Financial Times/Prentice Hall, Harlow, UK.

Further Reading

  1. Gibbins, C. (2011) ‘Unlocking the true value of CRO’, RedEye Whitepaper, March.
  2. Kaushik, A. (2006) ‘10/90 Rule’, available at http://www.kaushik.net/avinash/the-10-90-rule-for-magnificient-web-analytics-success/, accessed 1 February 2012.
  3. Nakatani, K. and Chuang, T. (2011) ‘A web analytics tool selection method: An analytical hierarchy process approach’, Internet Research, Vol. 21, No. 2, pp. 171–186.
  4. Sharma, R.S. and Dijaw, V. (2011) ‘Realising the strategic impact of business intelligence tools’, Vine: The Journal of Information and Knowledge Management Systems, Vol. 41, No. 2, pp. 113–131.
  5. Patron, M. (2011) ‘A structured approach to conversion rate optimization’, Whitepaper published at Redeye.com, October 2011.
  6. Smart Insights. (2011) ‘Your new, new media options’, Smart Insights blog post by Dave Chaffey, 11 July.
  7. Web Analytics Association. (2011) ‘Outlook Survey Report’, published February 2011.
  8. Web Analytics Association. (2011) ‘Definition of web analytics’, available at http://www.webanalyticsassociation.org/?page=aboutus, accessed 1 February 2012.
  9. Wilson, R. (2010) ‘Using clickstream data to enhance business-to-business website performance’, Journal of Business & Industrial Marketing, Vol. 25, No. 3, pp. 177–187.

Copyright information

© Palgrave Macmillan, a division of Macmillan Publishers Ltd 2012

Authors and Affiliations

  • Dave Chaffey 1
  • Mark Patron
  1. Smart Insights Limited, Leeds, West Yorkshire, UK
