Quality Signals that May have Affected the March 2017 Google Fred Update

Googler Gary Illyes declared that, unless Google gives them another name, ALL of their search “updates” will be referred to as “Fred”.  He was joking but, honestly, I like that.  Some years ago I wrote a brief article titled “Your Name is Fred – Please Use It”.  It was a warning to the various random commenters on SEO Theory who were hand-typing keywords and business names into the “name” field of our comment form.  At the time it was obvious that people were doing this, because the automated tools were leaving very different comments.  “Fred” is a good generic name for anything that has to do with search quality.

Quality signals the March 7 Google Fred update may use

But most of you still think of “Fred” as an update that Google released into the wild around March 7 or March 8, 2017.  Because they release two or three updates per weekday, Googlers were reluctant to confirm that anything special happened in early March.  In fact, Fred happens five days a week for someone.  You just won’t ever see an end to the list of complaints against Google about Fred, because Fred is an ongoing phenomenon.

That said, Gary Illyes bowed to public pressure and confirmed that there was some sort of change to Google’s search system around March 7–8.  Googlers never denied there was an update.  All Gary said was, “Obviously there was an update.  Why would we deny that?”

Keeping in mind that there have been many Fred updates since March (at least 60), let’s take a look at what we have learned from Google since then.  If you want to skip the recap, jump ahead to the discussion of quality signals that may affect the March 7 Fred.

What We Have Learned about the March 7, 2017 Fred Update

On March 13 Barry Schwartz published a list of publicly confirmed “March 7 Fred victims” and the results of his analysis of those sites and many others that were privately shared with him.  Barry’s conclusion: “…95% of [sites he examined] share two things in common. The sites all seem content driven, either blog formats or other content like sites and they all are pretty heavy on their ad placement.”

You can browse Barry’s list of sites.  If you look at the sites in question, try to pull them up in Archive.Org to see what they may have looked like before March 7, prior to any changes made since then.  This should be the first thing you do when reading ANY SEO case study that names Websites, because the sites are often changed substantially either by the time the case studies are published or soon after.

On March 27 Googler John Mueller confirmed in a hangout that the March 7 Fred update targeted quality issues:

Essentially, if you are following the Google guidelines and you are doing things technically right, then that sounds to me like there might just be just quality issues with regards to your site — things that you can improve overall when it comes to the quality of the site.

Which means there is no simple kind of answer, like there is no meta tag that would make your web site higher quality. It is just in general … you probably need to take a step back — get random people who are interested in that topic to review your site compared to other sites to kind of go through a survey to see what you can be doing better to improve the quality of your site overall. And ideally, don’t just like tweak things to kind of subtly improve the quality to see if you can get a little higher. Really take a stab at it and try to figure out what you can do to make a significant dent in the quality of the site overall.

You can see the video in Barry’s article (the link I provided above).  Notice how John emphasizes getting feedback from other people.  He implied very strongly that this update was targeting the user experience.

On April 1 Gary responded to a question on Twitter about whether the March 7 Fred was one update or several.  He reiterated that “Fred” is not just one update (in his mind) but all Google updates.  Nonetheless, he restated what should be obvious to everyone by now: there were several updates around that time.  So it’s highly probable that some small percentage of people who feel their sites were hit by the March 7 Fred update were, in fact, hit by March 7-a Fred, or March 6-a/b Fred, or March 8-a/b Fred.

How can you know whether you were hit by the one big thing or one of the others?  One test is to compare your site’s purpose and design to the purposes and designs of the publicly confirmed sites.  If the purpose of your site is substantially different from theirs; if advertising really isn’t the reason why your site exists, then maybe you were hit by a different early March Fred.

Finally, on June 14 Gary talked about the March 7 Fred again.  He again pointed to the Google Guidelines and said that Webmasters who feel they were affected by this algorithmic change should look at the Quality section in the guidelines.

A Note about Ad Placement Algorithms: If Barry is correct in his assessment, if Google did target made-for-advertising (MFA) sites with the March 7 Fred update, this is not the first time they have gone after these kinds of sites.  There have been several major MFA updates or spam operations in the past.  The Page Layout Algorithm is perhaps the most notable such action in the past, although Web marketers had mixed reactions to that one.  Matt Cutts (then head of the Google Web Spam team) finally disclosed in a video that if you had advertising covering about the same area as 2 Post-It Notes side-by-side at the top of your page, that could trigger the Page Layout Algorithm, which demoted your site (page) in the search results with some sort of negative ranking score.

And let’s talk about “thin content”: In 2014 Google’s Web spam team hit a group of aggressive affiliate Web marketers (many of whom had been coached by a specific marketing expert) with “thin content” penalties.  One specific message some of them received in Google Search Console read:

Thin content with little or no added value  (Affects: All)

This site appears to contain a significant percentage of low-quality or shallow pages which do not provide users with much added value (such as thin affiliate pages, cookie-cutter sites, doorway pages, automatically generated content, or copied content).  [Learn more]

My partner Randy Ray and I looked at several of the sites that had been hit.  One site was, in our opinion, an adequately written and thoughtful Website about a very specific hobby, although we did not feel it was high-quality content.  But the site was absolutely annoying to read because of all the affiliate links.  The other sites we looked at used vague, ambiguous content and generic pictures, and were generally uninformative about their topics.

Curiously, the coaching expert concluded that the penalties were applied because the sites used Private Blog Networks (PBNs) for links.  That seemed unlikely to us because Google tells you when it objects to your links.  There was a wave of PBN de-indexings around early September 2014 that hit that segment of aggressive affiliate marketers very hard.  But some of the “thin content” warnings looked very different from the above warning.  Here is an example:

Google has detected that some of your pages may be using techniques that are outside our Webmaster Guidelines.

As a result of your site having thin content with little or no added value, Google has applied a manual spam action to [DOMAIN].  There may be other actions on your site or parts of your site.

Soon after the thin content penalties Google rolled out Penguin 3.0, which was released in October 2014.  It’s possible the PBN networks were hit by a late-stage test of that Penguin release.  Either way, the loss of link value from de-indexed PBNs should not result in manual penalties for “thin content”.  The sites would have lost rankings as any links that were helping them were terminated or turned into negative value by Google’s Web spam team or Penguin.  But that second “thin content” penalty ends with an ominous “there may be other actions on your site or parts of your site”.  The key takeaway here is that a thin content penalty may be combined with other penalties, but Google considers thin content to be the deeper sin.

In retrospect it looks like that group of affiliate marketers was hit by a perfect storm: their sites were penalized for publishing thin content and their PBNs were destroyed.

What Quality Signals Might Google Use to Gauge User Experience?

Unfortunately we still see many Web marketers, including some of the more notable personalities on the conference and video circuit, conflating “user engagement” with “user experience”.  These are two entirely separate concepts.  And “user engagement” is sub-divided into “on-site user engagement” (which Google cannot measure) and “off-site user engagement” (of which Google can only measure what occurs on its own sites).  User engagement is not a quality signal.  Which users click on a search result in no way indicates whether the page satisfied them; and one man’s “high quality content” is another man’s “this has nothing to do with what I want”.

User Experience occurs on the page.  It only occurs on the page, and it has nothing to do with how many people click on a link, view a picture, read a paragraph, or otherwise “engage” with the page components.  While there may be value in tracking how far down the page a user scrolls or which links the visitor clicks on, those are not quality signals.  People may scroll down the page to read all the text, to scan for something specific, or to see what the sales pitch is.  And they may click on navigational links because they love what they are reading and want more, realize the search engine sent them to the wrong page, or have decided to learn more about the site/company.  An action by itself tells you nothing.

So let’s look at the list of Google’s Quality Guidelines that Googlers keep associating with the March 7 Fred update:

Basic principles

  • Make pages primarily for users, not for search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Specific guidelines
Avoid the following techniques:

  • [*] Automatically generated content
  • [*] Participating in link schemes
  • [*] Creating pages with little or no original content
  • Cloaking
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages
  • Scraped content
  • [*] Participating in affiliate programs without adding sufficient value
  • [*] Loading pages with irrelevant keywords
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
  • Abusing rich snippets markup
  • Sending automated queries to Google

Follow good practices like these:

  • Monitoring your site for hacking and removing hacked content as soon as it appears
  • Preventing and removing user-generated spam on your site

So I have flagged with [*] the items that are most likely to apply to aggressive affiliate marketing.  At this point we’re only concerned with what happened in March 2017, not August / September 2014.

I think it’s highly doubtful that March 7 Fred cares about links pointing to a site.  Penguin will take care of a lot of bad links.

And whenever a Googler has singled out “irrelevant keywords”, in my experience they usually reference lists like this:

  • viagra
  • phenergan
  • cheap car rentals
  • cheap car insurance

On sites where I see this kind of content the keywords are either thrown in as long lists (sometimes with hundreds of words and phrases) or they are randomly injected into text that may itself just be nonsense.  Aggressive keyword stuffing is nothing like the usual affiliate marketing “we made this page about [YOUR KEYWORD HERE] so that you would come and visit us when searching for [YOUR KEYWORD HERE].  Any site that talks about [YOUR KEYWORD HERE] should tell you some vague and generic stuff about [YOUR KEYWORD HERE] so that we can make lots of money.”

A good affiliate marketing Website is hard to build.  You want to target as many keywords (queries) and topics as possible and you want to get the sites out as quickly as possible.  There are online services that do all this for you.  Pay them $24-100 per month and they’ll build the sites.  You just give them your Amazon ID (or whatever affiliate program you’re using) and they do all the work for you.  What could possibly go wrong with that plan in 2017?

Although there is currently a pretty substantial market in autogenerated content, not everyone who invests in aggressive affiliate marketing is using those services.  Some are instead paying for “premium” content, which is usually hand-written by low-budget freelance writers whose command of American English is, shall we say, limited.  If you’re running US Websites that target US consumers, you had better be using content that is written for American visitors.

There might be a connection between poorly written English text and algorithms that look for keyword stuffing.  How is a mere algorithm supposed to distinguish between chopped-up English sentences written by an overseas freelancer and chopped-up English sentences injected with random keywords?  The semantic analysis required to distinguish between those kinds of content is not simple.
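As a purely illustrative sketch (my own toy example, not anything Google has published), even a crude repetition heuristic shows why this is hard: it can flag a blatant keyword list, but repetition statistics alone say nothing about whether repetitive text is spam or merely poorly written:

```python
from collections import Counter

def stuffing_score(text: str, top_n: int = 5) -> float:
    """Crude keyword-stuffing heuristic: the share of all
    non-trivial words taken up by the few most repeated ones."""
    words = [w.lower().strip(".,") for w in text.split() if len(w) > 3]
    if not words:
        return 0.0
    counts = Counter(words)
    top = sum(count for _, count in counts.most_common(top_n))
    return top / len(words)

natural = ("We compared several rental agencies on price, insurance "
           "options, and pickup locations before choosing this one.")
stuffed = ("cheap car rentals cheap car insurance cheap car rentals "
           "best cheap car insurance cheap car rentals deals")

print(stuffing_score(natural) < stuffing_score(stuffed))  # True
```

A real system would need semantic analysis far beyond this; a clumsily written but honest page can score just as high as a lightly stuffed one.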

Where you may be able to get by with poorly written content is on a site that ignores monetization.  You’ll be less likely to focus on specific product-related keywords.  You won’t be using “SEO optimized” keyword sub-headings.  You won’t be linking to affiliate product pages with targeted keywords.

Today’s Affiliate Marketing Industry is Its Own Worst Enemy

Every year we see the affiliate marketing forums and social media communities descend into panic as their latest methods are called into question by search engine practices.  You can almost clock these meltdowns by the arrival of new upsells from affiliate marketing gurus who have invested 1-2 months in figuring out what went wrong with their last campaign.  They cannot wait to get their ideas to market, because if you don’t strike while everyone is in a panic, sooner or later someone else will figure out what little tweak to make and will sell it until the search engine catches on.

These are the people who cannot let go of their autogenerated content.

These are the people who won’t publish a blog post without loading it up with links to their programs.  Yes, it’s an income.  But even Jeff Bezos and Warren Buffett occasionally publish thoughtful opinion pieces that don’t in any way promote their businesses or products.

Affiliate Marketing is Not Search Engine Optimization

That word “optimization” keeps finding its way into a lot of conversations.  You may be good at monetizing Websites but if you keep getting penalties you suck at Search Engine Optimization.  For that matter, you suck at Search Engine Optimization if you seriously think “SEO optimized” means anything other than “search engine optimization optimized”.

If your idea of SEO is to grab as many links as you can (by whatever means possible) you suck at Search Engine Optimization.

If you think domain authority predicts anything useful you suck at Search Engine Optimization.

There is no optimization in taking one idea and beating it to death time and time again and then complaining that Google must be doing something wrong when it catches on.

Optimization is an important word.  You should understand what it means.  If you’re not interested in using a resource as effectively as possible then you’re not optimizing.  You’re just using a word blindly.

The tradeoff between short-cut Web marketing and Search Engine Optimization is that you sacrifice the optimization for the sake of gaming the search engine.  You can play the long game or the short game.  The long game optimizes.  The short game does not.

So How Should Affiliate Marketers Optimize Their Websites?

Let’s start with the Product Reviews.  These are the meat and bones of affiliate marketing.  Say something substantial.  Say something definitive.  Say something opinionated.  Make the opinion the point of the review.  A review that offers no opinion is not a very useful review.

If you ask your next door neighbor what they think of the new grocery store, how do you think you’ll feel if they respond by saying, “Well, SuperSavers Grocery Mart operates 500 stores in 23 states.  They carry fresh produce from 5300 local suppliers and their in-store inventories represent on average 12,300 name brand products.  Your local SuperSavers Grocery Mart is open from 6 AM to 12 Midnight.  They accept credit cards and cash payments.  They also have a special program for EBT card holders and discounts for senior citizens.”

Thank you very much, neighbor, I could have read that on the company’s Website (and probably more concisely).  Search engines are not perfect but sooner or later, as real information and opinions are published, the SERPs change.  And eventually the Web spam teams catch on to the PBNs, the autogenerated content, and the other tricks of the trade and they do something about them.  Don’t you ever get tired of building new Websites every few months?

Another thing affiliate marketers can do is share some real data.  Now, good, interesting, useful data is hard to come by, and you don’t need to populate every page with it.  God, I hope the days of injecting pages with local community demographic data are over, but probably not.  What I mean by “share some real data” is: provide some personal anecdotes.  In other words, if your reviewers are not using the products and services, they should be.  Opinions don’t have to be based on real experience (and most opinions probably are not), but in my opinion the best-informed opinions (even the ones I disagree with) are based on real experience.

A search algorithm may one day be able to figure out which opinions are bullshit and which are real.  If there is no such algorithm yet, there will be.  You can bet on that.  So why wait until everyone is complaining that vaguely written content is no longer sufficient?

And let’s talk about advertising.  There is darned little explicit guidance in Google’s quality guidelines about user experience.  Virtually everything in the guidelines is really about user experience, but they don’t just come out and say, “Doing this creates the kind of negative user experience that we try to identify algorithmically”.  To Google’s credit, they do preface their list of quality guidelines with a couple of paragraphs that include this sentence: “Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.”

How does advertising affect user experience?  And what, exactly, is advertising?  Here is an advertising link:

Subscribe to the SEO Theory Premium Newsletter.  Ever since 2012 our subscribers have read in-depth case studies that examine page and Website quality issues.  We have provided clear, explicit lists of suggestions that Webmasters can use to improve the quality of their content and their Websites.

If you see occasional banners embedded in the articles on this site, those are advertising links, too.  The search engines can see and follow some of our advertising links.  We block the search engines from following other advertising links.  We don’t follow a specific formula.  Instead we sit back, let our memories of what changes we have made fade for a while, and then we come back to our sites and look at them with a fresh perspective.  Just how much annoyance are we willing to tolerate while reading our own content?  We’ll tolerate less than that from someone else’s site, so we try to reduce our self-annoyance on our own sites.

Advertising may pay the bills.  It may just offset some of the bills.  It’s not a bad thing.  But do you really need to embed a big BUY THIS PRODUCT NOW call to action in-between every paragraph?

The difference between a site like Amazon and a typical affiliate marketing site is that a visitor to Amazon can learn something useful without ever buying a thing.  Amazon is the Web’s number 1 source of information about millions of consumer products.  You get real information there.  And all Amazon really is, folks, is an oversized affiliate marketing company.  Yes, they have warehouses and fulfill orders directly but their Amazon Marketplace offers a lot of products that are shipped directly from the providers, not from the Amazon warehouses.

Amazon is the world’s number 1, largest, most successful, highest performing affiliate Website.

If you cared about your affiliate sites the way Amazon cares about theirs, you would probably have fewer complaints about Google and your revenue streams would probably be smoother and more dependable.  You don’t have to be like Amazon. You just need to care about each site you build as if you were Amazon.

Amazon is all about the user experience, and I say that as an Amazon customer who has scrolled past their promotional widgets dozens of times per day.  Yes, I know the cross-selling is endless on Amazon.  But they still don’t have a Google problem.

You need to bend it like Bezos if you want to optimize for search in affiliate marketing.  Otherwise, you’ll be buying the next quick fix for the latest Google crackdown on cheap affiliate marketing sites for the rest of your marketing career.  Is that really how you want to do business for the next 10-20 years?
