The simple answer is to invest more time and money in identifying existing conversion holes and plugging them with data-driven site improvements. Ecommerce teams often focus on the next big site enhancement while passing over basic UX flaws and conversion-path inefficiencies, whose impact compounds as traffic grows.
True optimisation requires a structured approach to using data to drive decision-making. By data, I mean the different data types that help you identify problems, generate hypotheses for the causes and put plans in place to test solutions and implement successful changes. Common data types include:
- Customer feedback – surveys, polls, reviews, live chat feedback etc.
- Web analytics – segmented analysis of page views, goals, conversions etc.
- Usability tests – in-person, remote video testing etc.
- Competitor benchmarking – using tools to provide a consistent comparison
The end goal is to operationalise testing to ensure there is a regular programme of both small and large improvements. By operationalise, I mean creating a culture where testing is BAU (business as usual) and woven into core planning processes. For example, when the digital marketing team plans a new email campaign, it should flag which tests can be run, why they are being run and how results will be measured, then use the learnings to improve future campaigns.
This means having a testing champion within the business who is skilled in analysis and can work across digital teams to assess current performance and identify areas of weakness/opportunity.
It also means measuring, not just reporting. Each week the ecommerce team should present performance reports with clear recommendations that answer the following questions:
- What happened last week and how did it compare to the prior week and last year?
- Why did this happen and how do we explain differences?
- What did we learn?
- How can we use this learning to improve?
- What are our top actions to take away and who is the owner/what is the timeframe?
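The week-on-week and year-on-year comparisons behind these questions are simple to automate. Here is a minimal sketch; the metric names and figures are hypothetical, and real reports would pull these numbers from the analytics tool rather than hard-coding them:

```python
# Sketch of the WoW / YoY comparison that underpins a weekly report.
# All figures below are illustrative placeholders, not real data.

def pct_change(current: float, prior: float) -> float:
    """Percentage change from a prior period to the current one."""
    return (current - prior) / prior * 100

# Hypothetical figures: (last week, prior week, same week last year)
metrics = {
    "sessions": (52_000, 50_000, 41_600),
    "conversion_rate": (2.10, 2.00, 1.75),
    "revenue": (118_000, 110_000, 95_000),
}

for name, (latest, prior_week, last_year) in metrics.items():
    wow = pct_change(latest, prior_week)
    yoy = pct_change(latest, last_year)
    print(f"{name}: {wow:+.1f}% WoW, {yoy:+.1f}% YoY")
```

The numbers only answer the first question; the "why", "what did we learn" and "who owns the action" rows still need human commentary.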
A report without learnings, recommendations and actions is pretty much pointless.
To support this, there should be a regular assessment of the data tools supporting the business. Web analytics implementations are often flawed: a generic setup rather than a configuration aligned with business goals that delivers the level of insight needed to drive action. Flawed data = flawed insight, which impedes website optimisation.
It’s essential that analytics tools support analysis, rather than dictate what analysis can be done. This demands specialist skill to ensure the technical implementation is correct, in terms of site-wide tagging and tool configuration. Sense-checking the web analytics implementation as part of release testing is important; it’s not uncommon for tags to be overwritten and data lost. You can only protect data integrity if there is clear ownership of data quality.
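That sense check can be automated as a lightweight release test: assert that key pages still contain the expected tag snippets. The sketch below is illustrative only; the snippet strings (a Google Tag Manager container load and a data layer reference) and the page sources are hypothetical examples, and a real test would fetch live or staging page HTML:

```python
# Sketch of an automated tag sense check for release testing.
# REQUIRED_SNIPPETS and the sample pages are hypothetical examples.

REQUIRED_SNIPPETS = [
    "googletagmanager.com/gtm.js",  # e.g. the GTM container script
    "dataLayer",                    # e.g. the data layer object
]

def missing_tags(html: str) -> list:
    """Return the required snippets absent from a page's source."""
    return [s for s in REQUIRED_SNIPPETS if s not in html]

def check_pages(pages: dict) -> dict:
    """Map page name -> missing snippets, for pages that lost their tags."""
    return {name: missing for name, html in pages.items()
            if (missing := missing_tags(html))}

# Example: a release accidentally strips the tags from the checkout page.
pages = {
    "home": ("<script src='https://www.googletagmanager.com/gtm.js'>"
             "</script><script>window.dataLayer = []</script>"),
    "checkout": "<html><body>Checkout</body></html>",
}
print(check_pages(pages))  # flags 'checkout' as missing both snippets
```

Failing the build when `check_pages` returns anything is one way to give data quality the clear ownership described above.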
James Gurd is Owner and Lead Consultant at Digital Juggler, an ecommerce and digital marketing consultancy. He has more than 14 years’ ecommerce experience in B2C and B2B, helping clients with digital strategy, execution and optimisation.