How to Optimize Google AdWords? Data-Driven
My articles are published first on my public account, then synchronized to CSDN, Zhihu, and my personal blog: haiwaibiji.com.
How do you optimize your Google ad placements? This is something many newcomers want to know. With Google ads, no matter which approach you take, what it all ultimately comes down to is data.
That is why I strongly recommend starting from the data when optimizing ads, rather than from so-called branding or "user mindshare". To be blunt: those approaches waste both money and time.
I. Clarify the goals of data analysis
Whether you are troubleshooting a problem or a business need calls for data mining and analysis, you first need to define the goal of your analysis.
Goals should follow the SMART principle as far as possible; the four SMART items discussed below are all required.

If the goal is not Specific
Discuss it further with your superiors and peers to see whether there is another way to make the goal clearer.
For example: "advertising should achieve a strong branding effect."
What counts as a strong branding effect, and by which metric is "strong" measured? This is not a clear goal.
If the goal is not Measurable
Turn unquantifiable targets into measurable indicators in a sensible way, while making sure the goal itself is still the right one. Analytics, labeling, and other statistical methods are usually used to split the dimensions.
For example: "develop high-potential talent."
What is potential? Potential can be broken down into learning ability, imitation, summarization, logical expression, and so on, and for each dimension you can design test questions and score the results.
If the goal is not Achievable
Change your thinking: either look outward to expand your resources, or look inward to sort out what can actually be done.
For example: iOS 14 breaks device-ID tracking.
Can we look to an external partner such as AppsFlyer for a solution, or adjust our internal attribution approach? Complaining to Apple is also an option, but the chance of that working is very low.
If the goal is not Relevant
If you set a goal blindly, without understanding the business behind it, the analysis will drift away from reality. In that case, coordinate closely with the departments, people, and business lines most relevant to the goal; more communication leads to a better understanding of the goal.
For example, if ad spend in Brazil suddenly runs wild, you need to check with the Brazil team to understand whether it is related to local business activity.
Overall, the goal of data analysis is to use data logic to find a fact that is clear, measurable, and at the same time changeable within our power, and to make sure our subsequent effort can actually influence the course of that fact.
II. Start from the business process
Process Idea #1: From Big to Small
Sort out the business process from market to overall to channel to project to individual.
Process Idea #2: Follow MECE and the Pyramid Principle
Work through the horizontal and vertical dimensions so that nodes do not overlap and every node is connected (mutually exclusive, collectively exhaustive).
Process Idea #3: Capture the key points and clarify the scope of data impact
Avoiding wasted effort means accepting that there are many nodes to watch, but not every node deserves attention. Each piece of data influences only a limited number of segments, so identify the key node in each segment and be clear about the scope and boundaries of its influence.
You, reading this in front of your screen, can try the sandbox below: if you were doing this analysis, how would you approach it?
Analysis sandbox
Background: an individual account's Brazilian Google Ads ROI appears abnormally elevated, up about 180%.
Analysis flow:
| Level | What to check | Status |
| --- | --- | --- |
| Market (overall) | Is the overall market fluctuating? | No anomaly |
| Website | Is the Brazil channel fluctuating? | No anomaly |
| AF attribution | Is the AppsFlyer attribution abnormal? | No anomaly |
| SEM channel | Is Brazil fluctuating significantly? | Anomaly |
| SEM channel | Is the entire channel fluctuating? | Anomaly |
| SEM sub-channels | Is Google the only anomalous sub-channel? | SEM-wide anomaly |
| Brazil coordination | Is the Brazil team's activity abnormal? | No anomaly |
| Individual ads | Are the Brazilian ads themselves abnormal? | No anomaly |
Reasonable inference: the market, the overall picture, operations, and the ads themselves show no anomaly, so the cause may be an internal anomaly in how the AF data is attributed, which needs technical troubleshooting.
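To make the drill-down concrete, here is a minimal sketch (not from the original workflow) that walks the levels in order and flags the anomalous ones. The level names and statuses simply mirror the table above; in a real check each status would come from a metric query rather than a hard-coded string.

```python
# Minimal sketch of the drill-down: walk the levels from broad to narrow
# and report which ones show anomalies. Statuses here are hard-coded to
# mirror the sandbox table; in practice each would come from a data check.
checks = [
    ("Market (overall)",    "overall market fluctuation",    "no anomaly"),
    ("Website",             "Brazil channel fluctuation",    "no anomaly"),
    ("AF attribution",      "AppsFlyer attribution status",  "no anomaly"),
    ("SEM channel",         "Brazil fluctuation",            "anomaly"),
    ("SEM channel",         "whole-channel fluctuation",     "anomaly"),
    ("SEM sub-channels",    "is Google the only anomaly?",   "SEM-wide anomaly"),
    ("Brazil coordination", "local campaign activity",       "no anomaly"),
    ("Individual ads",      "the Brazilian ads themselves",  "no anomaly"),
]

anomalies = [(level, item) for level, item, status in checks if status != "no anomaly"]
if anomalies:
    print("Anomalous levels:", anomalies)
else:
    print("No anomaly at any level; suspect the data pipeline itself.")
```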
III. Three axes of data analysis
These three moves form a well-recognized analytical process in the industry.
1. Comparison
Compare the data that has changed: under the same dimensions (time, country, delivery method, same period), what exactly differs, how large is the change, and does it fall outside the usual empirical range?
An example: temperatures have been fluctuating a lot recently.
Measure the difference between morning and evening temperatures and compare it with the typical daily range to know whether the recent fluctuation really is unusually high.
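As a rough sketch of the comparison step, assuming you already have a daily metric series exported from your reporting tool (the numbers below are invented), you can compare the latest value against the historical mean and standard deviation to judge whether it falls outside the empirical range:

```python
import pandas as pd

# Illustrative daily ROI series; in practice this comes from your ad report export.
roi = pd.Series(
    [1.1, 1.0, 1.2, 0.9, 1.1, 1.0, 1.05, 2.9],  # last value is "today"
    index=pd.date_range("2020-10-01", periods=8, freq="D"),
)

history, today = roi.iloc[:-1], roi.iloc[-1]
mean, std = history.mean(), history.std()

# Flag the value if it sits more than 3 standard deviations from the
# historical mean, i.e. outside the empirical range.
is_outlier = abs(today - mean) > 3 * std
print(f"today={today:.2f}, mean={mean:.2f}, std={std:.2f}, outlier={is_outlier}")
```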
2. Breakdown
A. Where the data anomaly has been clearly identified, break it down by business composition, piece by piece.
For example: ROI keeps declining.
Check whether the ads are delivering normally, check the ROI downtrend by country, and check the gap between target and actual ROAS values. If nothing looks unusual there, then check the landing pages and keywords.
B. For the structural composition of the anomalous data, break down the differences between the different structures under the same indicator one by one.
For example: an anomaly in a particular ad's spend.
The structures that make up this spend include time of day, creative, audience, keywords, and so on. By dismantling the spend structure of the abnormal period and clarifying which of these segments is abnormal, you can identify which issue is the root cause with the biggest impact on the anomaly.
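Here is a hedged sketch of that kind of breakdown, with made-up column names and numbers: group the spend of a normal period and the anomalous period by one structural dimension and see which segment accounts for most of the change.

```python
import pandas as pd

# Toy spend data for a normal day and an anomalous day, broken down by
# one structural dimension (here: keyword). Columns are illustrative.
normal = pd.DataFrame({"keyword": ["brand", "generic", "iphone12"],
                       "spend":   [100.0,   80.0,      5.0]})
abnormal = pd.DataFrame({"keyword": ["brand", "generic", "iphone12"],
                         "spend":   [110.0,   85.0,      400.0]})

merged = normal.merge(abnormal, on="keyword", suffixes=("_normal", "_abnormal"))
merged["delta"] = merged["spend_abnormal"] - merged["spend_normal"]
merged["share_of_change"] = merged["delta"] / merged["delta"].sum()

# The segment driving most of the change appears first.
print(merged.sort_values("share_of_change", ascending=False))
```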
3. Traceability
Once it is clear which business segment or data structure the problem sits in, you can trace the anomaly back to its source. This analysis mainly splits out the factors driving the anomalous segment.
Case in point:
Background: in October 2020, several SDS ads showed spend anomalies.
Comparison and breakdown:
After splitting the data, it was found that keyword costs in the UK suddenly skyrocketed at 1:00 a.m.
Traceability:
Analyzing the keywords showed that 80% of the spend was concentrated on terms such as "iphone12", "iPhone", and other keywords completely irrelevant to the ads.
Action:
Exclude the spiking keywords (add them as negatives) across all ads globally, and report the exception to the Google channel team to check why these irrelevant keywords are triggering the ads.
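As an illustration of how that concentration might be surfaced (the keyword table below is hypothetical, not the actual October 2020 data), sort keywords by spend and keep those that make up the first 80% of the total:

```python
import pandas as pd

# Hypothetical keyword-level spend export; replace with your own report.
kw = pd.DataFrame({
    "keyword": ["iphone12", "iphone", "brand term", "category term"],
    "spend":   [5200.0,     3100.0,   600.0,        400.0],
})

kw = kw.sort_values("spend", ascending=False)
kw["cum_share"] = kw["spend"].cumsum() / kw["spend"].sum()

# Keep keywords until the cumulative share reaches 80% of total spend.
top = kw[kw["cum_share"].shift(fill_value=0) < 0.8]
print(top[["keyword", "spend", "cum_share"]])
```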
IV. Data validation
When the data analysis reaches a conclusion, two situations commonly arise:
1. Data misjudgment
The actions taken based on the conclusions of the analysis do not solve the actual problem.
For example: the analysis finds a significant positive correlation between the Google target ROAS bid and ad ROI; but in practice, raising the ROAS bid does not necessarily lift ad ROI, and may even lower it.
This is really the familiar problem of distinguishing correlation from causation in data analysis.
We can easily see the correlation between two sets of data, but judging whether the relationship is causal requires more dimensions and more data for validation.
This is one of the difficulties of data analysis. So after the analysis, be sure to validate with data whether your conclusions are correct and whether you have misjudged anything.
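To make the point concrete: computing the correlation is a one-liner, which is exactly why it invites over-interpretation. The figures below are invented; the coefficient alone cannot tell you whether raising the bid causes the ROI change.

```python
import pandas as pd

# Invented daily data: target-ROAS bid and observed ad ROI.
df = pd.DataFrame({
    "target_roas_bid": [2.0, 2.2, 2.4, 2.6, 2.8, 3.0],
    "ad_roi":          [1.1, 1.2, 1.3, 1.3, 1.4, 1.5],
})

# A strong positive Pearson correlation...
print(df["target_roas_bid"].corr(df["ad_roi"]))

# ...says nothing by itself about causation. Judging that needs a
# controlled change (e.g. raise bids on some comparable campaigns and
# hold the rest), as in the validation step below.
```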
2. Attribution is complex
As mentioned above, causality between data points is hard to verify because an outcome usually has multiple causes, and their degrees of influence differ.
As a result, an analysis often surfaces several factors that all appear to have a large impact on the goal.
Given these two points, once the analysis reaches a conclusion, do not hastily close the book; in the follow-up operations, quickly run several rounds of data validation on the analysis results:
First, to detect whether the analysis has misjudged anything;
Second, to test the scope of influence and the relative importance of each attributed factor.
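One possible shape for such a validation round, assuming you can split comparable campaigns into a test group that receives the change suggested by the analysis and a control group that does not (the figures are illustrative):

```python
import pandas as pd

# Hypothetical post-change ROI per campaign: "test" campaigns received
# the change suggested by the analysis, "control" campaigns did not.
results = pd.DataFrame({
    "group": ["test", "test", "test", "control", "control", "control"],
    "roi":   [1.35,   1.42,   1.30,   1.10,      1.15,      1.08],
})

summary = results.groupby("group")["roi"].agg(["mean", "std", "count"])
print(summary)

# If the test group does not beat the control group by a meaningful
# margin, the earlier conclusion was likely a misjudgment, or the
# factor's real influence is smaller than the analysis suggested.
```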
V. Documentation
For every data situation, analysis, and validation process, it is best to keep the habit of recording documents and Excel data, noting the degree of impact at each step.
This helps you quickly locate the problem when something goes wrong, helps with project coordination, and helps you control every aspect of the business more professionally.
That, above, is the data-driven approach to optimizing Google ad placements.
It is really just a starting point, because the methods mentioned here are common to research and analysis across many disciplines.
Simply put, treat each placement as an experiment, rather than expecting every single one to be a high-return placement.
Of course, it's not for everyone.
Very often, good media buyers are forged by burning money: because they have felt the pain, they know which paths are dead ends and refuse to go down them again. But that mindset is built on countless campaigns with an ROI of 1, or even 0.x, and that price is enough to turn many bosses away.