Combine Analytics, BigQuery and machine learning to improve your online audience targeting.
How can I optimise my marketing efforts by identifying people with a high propensity to buy?
This is a question many marketing teams have. At the end of the day there is only a limited amount of money, time and effort you can put into your re-marketing activities. Wouldn’t it be great to do a large-scale analysis that will identify users with a high propensity to buy upon returning to your website? The good news is that this can be achieved with Google Analytics (GA) and Google Cloud.
“Business/data analytics and data storage/data management (both 43%) are projected to lead cloud adoption in 2017 and beyond.”
For most marketers, Google Cloud is a complicated platform that only highly tech-savvy individuals or teams can use. With features like Machine Learning, automation, virtual machines and Big Data, most people think they need a PhD in Computer Science just to operate it. The truth is that the Cloud is easy to access and use, and it can help identify potential buyers amongst all your website’s users.
By combining Google Analytics and Google Cloud, you can automate a process that will use Machine Learning to classify users into groups of buyers and non-buyers based on their website behaviour. These results can be fed back into Google Analytics where they can be used in your re-marketing activities with DoubleClick, AdWords, email or content personalisation with Optimize 360 (if you’re using Analytics 360).
What do you need to build a machine learning process?
To classify users, you will need to create a predictive machine learning model using Google Analytics and Google Cloud. The actual process is easier than most people think. There are only three technologies you need to make it work:
Google Cloud’s BigQuery
Google Cloud’s Compute Engine
Python / R
Data from Google Analytics is fed into BigQuery. Once in BigQuery, you prepare the data, move it to Compute Engine and execute your machine learning model there. Finally, you send the results back to Google Analytics.
Once the data is in GA you can use any re-marketing method you want: DoubleClick, AdWords, email and so on.
How do BigQuery and Compute Engine help build the machine learning model?
BigQuery is Google’s analytics data warehouse, which you query using SQL. Once you collect your data via Google Analytics, BigQuery is where you prepare it for your machine learning model. Because BigQuery can scan 1TB of data in seconds, you can use a whole year’s worth of data or more to build your machine learning dataset. This makes it an extremely useful tool for high-speed data manipulation and analysis.
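As a concrete illustration, a feature-preparation query against the Analytics 360 BigQuery export might look like the sketch below. The project and dataset names are placeholders, and the features you derive will depend on your site; the field names (fullVisitorId, totals.pageviews and so on) come from the standard GA export schema.

```python
# Sketch of a BigQuery feature-preparation query for the Analytics 360 export.
# The project/dataset names and date range are placeholders; the ga_sessions_*
# fields are from the standard GA BigQuery export schema.
QUERY = """
SELECT
  fullVisitorId,
  COUNT(*) AS sessions,
  SUM(IFNULL(totals.pageviews, 0)) AS pageviews,
  SUM(IFNULL(totals.timeOnSite, 0)) AS time_on_site,
  MAX(IF(totals.transactions > 0, 1, 0)) AS is_buyer  -- training label
FROM `my-project.my_ga_dataset.ga_sessions_*`          -- placeholder dataset
WHERE _TABLE_SUFFIX BETWEEN '20170101' AND '20171231'  -- a year of sessions
GROUP BY fullVisitorId
"""

# With the google-cloud-bigquery client library installed and credentials
# configured, running it would look roughly like this (not executed here):
# from google.cloud import bigquery
# rows = bigquery.Client().query(QUERY).result()
```

The label column (`is_buyer`) is what lets the model learn the difference between the behaviour of past buyers and non-buyers.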
Compute Engine is a Google Cloud service that lets you create virtual machines. A virtual machine (VM) will run the predictive model, which means you can automate the entire process rather than running each step by hand.
In the virtual machine, you grab the prepared dataset from BigQuery and run it through your model. The VM can be scaled up if more power is required, and it then sends the results back into Google Analytics via Data Import.
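To make the modelling step on the VM concrete, here is a minimal, self-contained sketch of a buyer/non-buyer classifier: a hand-rolled logistic regression trained with stochastic gradient descent. The features and data are synthetic and purely illustrative; in practice the feature rows would come from the prepared BigQuery dataset, and you would likely use an established library such as scikit-learn instead.

```python
import math
import random

def _sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train_logistic(features, labels, lr=0.05, epochs=300):
    """Fit logistic-regression weights with stochastic gradient descent."""
    w = [0.0] * len(features[0])
    b = 0.0
    idx = list(range(len(features)))
    for _ in range(epochs):
        random.shuffle(idx)
        for i in idx:
            x, y = features[i], labels[i]
            err = _sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Predicted probability that a user buys on their return visit."""
    return _sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Synthetic per-user features (sessions, pageviews): in this toy data,
# buyers simply browse more. Real features would come from BigQuery.
random.seed(0)
X = ([[random.uniform(5, 10), random.uniform(20, 40)] for _ in range(50)]
     + [[random.uniform(0, 3), random.uniform(1, 10)] for _ in range(50)])
y = [1] * 50 + [0] * 50
w, b = train_logistic(X, y)
print(predict(w, b, [8, 30]))  # a heavy browser scores high
print(predict(w, b, [1, 3]))   # a light browser scores low
```

The point of running this on a Compute Engine VM is simply that the whole train-score-export cycle can be scheduled and left to run unattended.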
With these three technologies, the entire predictive process can be automated, telling you which of your users are likely to buy when they return to the website.
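The final hand-off back to GA is a Data Import upload. The sketch below assumes a User Data import set keyed on a user-scoped custom dimension holding the client ID (ga:dimension1 here) with the propensity label in a second custom dimension (ga:dimension2); your dimension indices will differ, and the client IDs shown are made-up sample values.

```python
import csv
import io

# Assumed schema: ga:dimension1 = user-scoped custom dimension holding the
# client ID (the import key), ga:dimension2 = propensity label. The indices
# and the client IDs below are illustrative only.
scored_users = [
    ("1047284632.1496859013", "buyer"),
    ("2035571188.1497012341", "non-buyer"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["ga:dimension1", "ga:dimension2"])  # header row GA expects
writer.writerows(scored_users)
csv_payload = buf.getvalue()
print(csv_payload)

# The file can then be uploaded under Admin > Data Import in the GA
# interface, or pushed programmatically via the Management API.
```

Once the labels land in GA, they behave like any other custom dimension and can drive remarketing audiences.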
Sounds good, but how did we put it into practice?
icelolly.com, a holiday comparison site and Jellyfish 360 client, allowed us to implement a predictive machine learning model which classified users based on online behaviour into buyer and non-buyer categories. The model used a set of indicators that would predict future buying behaviour. The goal was to use the predictive results to re-target the users who are classified as buyers using advertising technologies like DoubleClick, Google Display Network, AdWords or Google’s Optimize 360 (via Analytics 360) tool.
There were two campaigns: one predictive model campaign and one control campaign.
The predictive campaign would target users the model identified as potential buyers.
The other would use a random segment of users.
Both campaigns had the same ad spend and creatives, so we could compare the two results fairly.
The predictive model campaign achieved a 45% higher conversion rate than the control campaign.
Overall engagement was also higher in the predictive campaign than in the control campaign.
Results like these can be expanded upon by breaking users down further into more groups: highly likely to buy, likely to buy, likely not to buy and non-buyer. Re-targeting becomes much more efficient and can save resources by targeting the right users.
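Splitting users into those finer groups is just a matter of thresholding the model's predicted probability. A minimal sketch, with illustrative cut-off values:

```python
# Bucket users into four re-targeting groups by thresholding the model's
# predicted buy probability. The cut-off values are illustrative; in
# practice you would tune them against your own conversion data.
def propensity_group(p):
    if p >= 0.75:
        return "highly likely to buy"
    if p >= 0.5:
        return "likely to buy"
    if p >= 0.25:
        return "likely not to buy"
    return "non-buyer"

# Hypothetical per-user scores from the model.
scores = {"user_a": 0.91, "user_b": 0.62, "user_c": 0.30, "user_d": 0.05}
groups = {user: propensity_group(p) for user, p in scores.items()}
print(groups)
```

Each group can then be imported into GA as its own label and given its own bid strategy or creative.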
Using Google Cloud:
The first thing to do is link your Analytics 360 data to BigQuery.
Fortunately, Google has a native connector for Analytics 360 accounts, which speeds up data extraction significantly and provides an unsampled, raw data feed.
At the Property level, there is an option to link your data to BigQuery.
Alternatively, for those who aren’t currently leveraging Analytics 360, we’d recommend pulling data from the Core Reporting API into BigQuery directly.
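For that non-360 route, the request to the Analytics Reporting API v4 might be shaped like the sketch below. The view ID is a placeholder, and note one real constraint: the standard reporting API does not expose ga:clientId, so a custom dimension capturing it is a common workaround.

```python
# Sketch of a Reporting API v4 request body for pulling per-user behavioural
# data to load into BigQuery yourself. The view ID is a placeholder, and
# ga:dimension1 is assumed to be a custom dimension capturing the client ID
# (ga:clientId itself is not exposed by the standard reporting API).
REQUEST_BODY = {
    "reportRequests": [
        {
            "viewId": "XXXXXXXX",  # placeholder: your GA view ID
            "dateRanges": [
                {"startDate": "2017-01-01", "endDate": "2017-12-31"}
            ],
            "dimensions": [{"name": "ga:dimension1"}],
            "metrics": [
                {"expression": "ga:sessions"},
                {"expression": "ga:pageviews"},
                {"expression": "ga:transactions"},
            ],
        }
    ]
}

# With google-api-python-client and credentials, the call would look like:
# analytics = googleapiclient.discovery.build(
#     "analyticsreporting", "v4", credentials=creds)
# response = analytics.reports().batchGet(body=REQUEST_BODY).execute()
```

The response rows would then be loaded into a BigQuery table, after which the rest of the pipeline is identical to the 360 version.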