A significant Google algorithm update happened on June 25. This update has had webmasters and SEOs buzzing all week about the significance of its effects on websites across the world.
Rather than succumbing to the information overload that accompanies Google's frequent "quality updates," only to forget about the other times you saw the same problems, let's cover the steps that turn a reactive response into a proactive plan that can help your struggling online presence recover for good.
Google Algorithm Updates Will Only Continue Coming (As They Have for Years)
It’s no secret that Google updates its algorithm often, and based on what we’ve seen since 2000, it will likely continue to do so for years to come.
If you're noticing a reduction in impressions in Google Search Console within the default 28-day view, expand the range to 90 days.
If you're examining site traffic in Google Analytics over the past week, expand to a full historical view. If the Google Analytics data you can access is filtered by your IP and your historical view doesn't go back far enough, switch to the unfiltered master view.
If you were hit by Google’s algorithm update this week, there’s a chance you’ve been hit before. You may not even realize it.
Why Switch to an Unfiltered View?
Switching to an unfiltered view isn't exactly the most scientific approach: it will include site visits from employees, webmasters, and digital support providers, which a properly configured IP-filtered view omits. Whether that matters depends on your site's overall traffic volume weighed against how many daily and monthly visits you believe those internal visitors generate.
If your site receives tens of thousands of sessions a week and those filtered-out internal visits number in the single or double digits over the same time frame, the numbers won't be exact, but you can still assess the historic damage with minimal variance.
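To put a number on that variance, a quick back-of-the-envelope check works: estimate internal traffic as a share of total sessions. This is a minimal sketch with illustrative figures, not numbers from the article.

```python
# Rough sanity check: estimate how much internal (normally filtered-out)
# traffic could distort an unfiltered historical view.
# All figures below are illustrative.

def internal_traffic_share(total_sessions: int, internal_sessions: int) -> float:
    """Return internal sessions as a percentage of total sessions."""
    return 100.0 * internal_sessions / total_sessions

# e.g. 40,000 sessions in a week, ~25 of which you estimate are internal
share = internal_traffic_share(40_000, 25)
print(f"Internal traffic share: {share:.4f}%")
```

At well under one percent, the unfiltered view can stand in for the filtered one with minimal variance; if the share climbs into whole percentage points, treat long-term comparisons with more caution.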
What You Should Look For in Historic GA Data
In short, you’re looking to find evidence of other events similar to June 25 that may have been overlooked. A simple way to handle this includes:
- In GA, navigate to Acquisition > All Traffic > Source/Medium.
- Extend the time frame from the default seven-day view to as far back as you can, preferably to before 2015 (I'll explain why later).
- Under the Source/Medium columns, select google/organic.
- View the line chart by Week or Month, especially if you're working with years of data.
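If you export that google/organic report (GA lets you export any report as a CSV), a short script can flag the weeks where traffic fell sharply versus the prior week, which is a quick way to surface candidate update dates worth investigating. This is an illustrative sketch; the 20% threshold and the session figures are assumptions, not values from the article.

```python
# Flag weeks where google/organic sessions dropped sharply week-over-week.
# The data and the 20% threshold are illustrative assumptions.

def flag_drops(weekly_sessions, threshold=0.20):
    """Return indices of weeks whose sessions fell by more than
    `threshold` (a fraction) compared with the previous week."""
    drops = []
    for i in range(1, len(weekly_sessions)):
        prev, cur = weekly_sessions[i - 1], weekly_sessions[i]
        if prev > 0 and (prev - cur) / prev > threshold:
            drops.append(i)
    return drops

sessions = [1200, 1180, 1210, 850, 830, 790, 520, 515]
print(flag_drops(sessions))  # weeks 3 and 6 show >20% week-over-week drops
```

Each flagged week is a candidate to cross-reference against known Google algorithm update dates and your own site annotations.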
Let’s use a test site of mine to uncover what a long, slow decline looks like if left untreated:
From here, you should have a pretty good glimpse at how your site has performed on Google Search. Now, you’ll need to ask yourself a couple of questions:
- Where are the dips in traffic? Any gains? When did the decline start?
- Can they be explained by seasonal factors or tracking issues?
- Are there any annotations that can provide a better understanding? These can often answer questions you may not have considered, particularly if they cover site maintenance and migrations. If you're lucky, you may even have a digital record showing a correlation between performance and Google's algorithm updates.
At this point, you have a number of options. One of my favorite things to do after an algorithm update rolls out is to look at the site's top landing page performance before and after the algorithmic event.
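That before-and-after comparison can be sketched in a few lines: pull landing-page sessions for a window on each side of the event and rank pages by percentage change. The page paths and numbers below are made up for illustration.

```python
# Compare landing-page sessions before and after an algorithmic event to
# see which pages lost the most ground. Pages and numbers are made up.

def landing_page_deltas(before: dict, after: dict):
    """Return (page, before, after, pct_change) tuples, biggest loss first."""
    rows = []
    for page, b in before.items():
        a = after.get(page, 0)
        pct = 100.0 * (a - b) / b if b else 0.0
        rows.append((page, b, a, pct))
    return sorted(rows, key=lambda r: r[3])

before = {"/blog/guide": 900, "/products": 600, "/about": 150}
after = {"/blog/guide": 380, "/products": 590, "/about": 155}
for page, b, a, pct in landing_page_deltas(before, after):
    print(f"{page}: {b} -> {a} ({pct:+.1f}%)")
```

Pages at the top of the sorted output are where a quality audit should start, since they absorbed the bulk of the algorithmic impact.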
What Does Panda 4.2 Have to Do With Google’s June 25 Update?
In short, Google’s Panda 4.2 refresh began slowly rolling out on July 18, 2015, taking weeks to fully go into effect. At the time, Panda was the name for Google’s algorithm that targeted sites of “low quality” content and rewarded sites that adhered to the search engine’s site quality recommendations.
Many of the points Google made regarding Panda's intent in 2014 remain a focus in 2017, most notably with the March 7 rollout of the "Fred" algorithm update.
Here is the same site’s organic performance on Google, comparing “Fred” to this week’s update on a daily basis:
The Next Step is Admitting You Have an SEO Problem
Being able to accurately identify the root causes plaguing your website can seem like a scary proposition for some.
Using the scientific method can help you figure out whether you have a big SEO problem.
For those who have left their memories of childhood science fairs at the door of adulthood, here are the steps along with a general example for each situation:
Make an Observation
This is where many of us were just a few days ago. For others, this might not have happened yet! One example of an observation I've seen across the web has generally followed the format of "My organic traffic on Google is down!"
Ask a Question
After identifying the downward trend, you may be asking yourself “Why did my organic traffic take a hit?” This is where most tend to stall, unsure of where to go or how to proceed. For SEOs, there’s a good reason for this…
Form a Hypothesis
Many digital marketers and webmasters never reach this step because, when it comes to handling "The Google Dance," it's easy to get overwhelmed by the sheer volume of ranking factors that come with the territory. However, by taking a step back, reviewing your site's historic performance, and comparing it against any changes made to your site, you can make a case such as: "Turning hundreds of pages with thin content into pages that speak to the intent of each page will restore our site's previous rankings."
Because this is a cause-and-effect relationship, be mindful of your variables: the aspects of your site that you're changing. If you aren't familiar with the site, or if your experience with general website optimization is minimal, you may want to control your other variables to ensure that changes outside the ones stated in the hypothesis don't turn your poor rankings into nonexistent ones.
Make a Prediction
“I predict that if I turn my site’s thin pages into vibrant pages that people want to read, share, click, and convert on, then my rankings will return.” Easy enough, right?
Conduct an Experiment
Now, this is where we turn a good idea into action. For this example, identify the pages you believe are the source of the traffic (and ranking) issues you observed, and confirm that updating those pages won't put other, unaffected pages at risk. It also needs to be said that if you're going to write great content, you should know how Google defines "great content."
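One rough way to shortlist candidate pages is to flag those whose body copy falls below a word-count floor, a common (if crude) proxy for "thin content." The 300-word cutoff and the page data here are illustrative assumptions, not a rule Google has documented.

```python
# Shortlist "thin content" candidates by word count.
# The 300-word cutoff and page data are illustrative assumptions.

def thin_pages(pages: dict, min_words: int = 300):
    """Return page paths whose body word count is below `min_words`."""
    return [path for path, words in pages.items() if words < min_words]

pages = {
    "/blog/guide": 1450,
    "/tag/widgets": 40,
    "/category/misc": 85,
    "/products": 620,
}
print(thin_pages(pages))  # ['/tag/widgets', '/category/misc']
```

Word count alone doesn't decide quality; treat the flagged list as a starting queue for manual review against search intent, not as a verdict.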
If all goes well, you stand to see your site return to its former glory, or better yet, reach new heights!
If this doesn't affect your site at all, you may have other issues at play, such as over-optimized anchor text or a poor mobile experience, which means you'll need to return to the hypothesis drawing board.
Since you’ve produced content that marketers dream of, this shouldn’t be a detriment once you begin your next experiment.
The More Things Change, the More They Stay the Same
One piece of advice I received early in my professional career came from my first agency mentor, who recommended that I focus less on the algorithms and more on consistently improving every detail, no matter how small.
With Google’s algorithms, there are no easy fixes for a poor showing. SEO success takes effort.
Your website is a single entity composed of hundreds of interconnected parts and numerous off-site accessories that, if addressed individually with a detailed plan of action backed by tested best practices and continuous research, will add up to success in the long run.
Screenshots by Beau Pedraza. Taken June 2016.
via Search Engine Journal