This video is part of a series answering questions on topics brought up in Executive Director Patrick Gilbert's new book, Join or Die: Digital Advertising in the Age of Automation, available now on Amazon.
Q: It is clear that the more data signals we get, the better. However, for clients with limited budgets, would you still use straight broad keywords or Google Partners to start gathering data, or would you only pick broad match modified and the Google SERP in order to be more conservative?
A: It's worth diving into more, but the short answer is that you want to strike a balance. You can add some restrictions, but you don't want to throttle potential volume too much.
Q: If smart bidding is working and you are hitting client goals, how do you grow such an account and show your agency is still able to add value?
A: The third section of Join or Die addresses this directly. Agencies aren't going anywhere... but the work that agencies perform for clients will change (it has already, for us at least).
Automation takes care of many of the tasks that used to fill our entire calendars. This allows us to focus on solving more sophisticated client problems. We spend more time in the weeds with their businesses and analyzing industry trends... We might make fewer physical changes in an account, but the impact of the overall partnership goes much further. This is, in large part, thanks to our adoption of automation.
Q: What're your thoughts on utilizing Google Ads for affiliate marketing? (Promoting high payout products like camping equipment for example.)
A: I don't have a ton of direct experience here, but I know that it's extremely popular. There are a lot of strict policies on this topic, so read up on them before you invest. For example, Google is very quick to identify "bridge pages" (pages that exist mainly to send traffic to a different domain)... and this will lead to an account suspension.
Q: Is it better to tell Google to optimize for specific conversions using conversion action sets or optimize for all conversions? Which way is better to teach Google's algorithm?
A: If both of your conversion actions have similar values, then optimize for both. The general rule of thumb: the more conversion data, the better. But don't get greedy with the conversions you're optimizing for... you want to keep the algorithms focused on your primary goals.
Q: One thing I'm still pondering is whether assisting the algorithm, or (as I've heard others describe it) putting necessary guidelines in place to guide the algorithm, really will kill the algorithm. I struggle to see how it has to be all or nothing (as appears to be suggested in the final minutes of the conversation). If I were to only listen to this video, I would walk away thinking the choice presented is essentially: go manual, or never touch anything in your campaigns again. Did I understand that correctly, or is there some nuance there I missed?
My initial thoughts: since machine learning utilizes signals (including our human-added signals) to help point it in the right direction, my understanding of our attempts to help the algorithm is not that they confuse and then shut down the algorithm, but that they are simply utilized as more of its many signals. I.e. (overused example), if we add a negative keyword, it doesn't completely kill the algorithm for the entire campaign... it simply helps guide the algorithm to be more targeted toward what we are interested in identifying. Of course, it's possible for us to kill the whole algorithm with too-aggressive human intervention, for sure. I'm just not ready to say there is no place for human assistance in the algorithms. That seems to ascribe infallibility to the algorithm, which I'm not ready to buy into... though it can certainly compute faster than we can. I'm curious to hear your thoughts on that tension.
A: Great points. We should have been clearer in our examples... when we were talking about the threat of restrictions (negative keywords, etc.), it was really as it relates to abusing them and/or adding unnecessary restrictions.
In short, there is still plenty of work to be done inside campaigns... These attempts to "guide the machine" are extremely important. However, they can be abused or misused. We shouldn't use "guiding the machine" as an excuse when what we are really doing is just tinkering for the sake of tinkering (driven by an innate inability to keep our hands off of something).
There are still plenty of times when we add negative keywords. I do it every day. But I am also trying to be conscious of any detrimental impact that a specific negative keyword might have.
We get a lot of questions in this Group about how to manage automation with smaller budgets. In these cases, I usually advocate for more restrictions, so that you can shrink the size of the data pool that Google is using to test for statistical significance. This is one of the key areas where we can guide the machine.
What we as humans are doing, in all cases, is creating the data environments in which these algorithms will learn.
If we give them unlimited access to all data under the sun... they will likely learn incredible things. They would have a high degree of accuracy in their ability to predict successful outcomes and scale our campaigns.
But... it will take them relatively more time (and money) to learn those things. Many SMBs don't have that time or budget, so a different, riskier route often needs to be taken.
If you shrink the size of the environment BUT ensure that the data within that environment is still very valuable, then you can expedite the learning process. The downside, though, is that with less data, the algorithm won't have as much confidence to scale.
A great example here is the use of Facebook's Lookalike audiences. Most advertisers on a limited budget have seen success with 1% Lookalikes. However, they've also struggled to increase their budgets and retain a profitable CPA.
Targeting a 1% Lookalike, as opposed to just targeting everybody on Facebook, is a much smaller data environment. However, you're giving Facebook a great dataset to learn from... even if it's smaller.
If you flip a coin twice, and it lands once on heads and once on tails, you have enough evidence to assume that there's a 50% probability for both outcomes at scale.
However, what if both outcomes here happened to land on tails? Under the same assumption, you would mistakenly conclude that 100% of outcomes will forever land on tails... but that isn't correct.
A smaller data environment could work ... but it could also not work. It's, um, a toss-up.
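The coin-flip point above can be illustrated with a quick simulation (a hypothetical sketch, not anything from the book): a two-flip sample can land anywhere from 0% to 100% heads, while a large sample reliably settles near the true 50% rate. The same logic is why small data environments can mislead an ad algorithm.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

def estimated_heads_rate(n_flips):
    """Flip a fair coin n_flips times and return the observed share of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# Tiny samples: five separate two-flip experiments.
# Each one can come out 0.0, 0.5, or 1.0 heads - all over the place.
small_samples = [estimated_heads_rate(2) for _ in range(5)]

# Large sample: 100,000 flips settles very close to the true 0.5 rate.
large_sample = estimated_heads_rate(100_000)

print(small_samples)
print(round(large_sample, 3))
```

More data doesn't change the coin; it just gives the estimate enough evidence to be trusted at scale, which is the same tradeoff described for small-budget campaigns.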
(Again, forgive the exaggerated example here, but...)
Assume Facebook learns that there are 10 key signals that are used to profitably convert users within your Lookalike audience. And it can do so at a budget of $100/day. It is likely that if you increased your budget to $1,000/day, those 10 signals would not be enough to continue to help you profitably convert at this scale.
But that doesn't mean that it's impossible for you to profitably convert traffic at $1,000/day. The algorithm just hasn't learned to do so yet, under the current data environment that you've presented to it.
Instead, if you had previously learned from a much larger audience, and therefore much larger data set, Facebook might have identified 50 key signals that can be leveraged.
It would have taken Facebook more time (and money) to statistically reach the point where it's confident in those 50 signals... However, those learnings allow you to be profitable at a much larger scale.
To summarize, human intervention is relevant and important. We need to create the environments in which these algorithms can learn... and tradeoffs often need to be made (smaller data, but the hope of expediting an accurate algorithmic process). And we humans should be aware of these tradeoffs, and all the associated risks.
And in my opinion, too often do we make decisions without carefully considering the risks, hence my slightly exaggerated tone in the video.
You can find more information and resources on data signals in your automated or manual bidding strategy in Join or Die: Digital Advertising in the Age of Automation.