Ultimate Guide to WordPress Migrations and SEO: Pre-Migration Prep

There are many reasons you might need to migrate a WordPress site – a change of domain name or perhaps moving a blog from a subdomain to a subdirectory. Whatever the reason, it is important to factor in SEO when considering a WordPress migration. But why, exactly, do you need to keep SEO in mind?

There are many signals that can affect organic rankings in search engines such as Google. These range from the text used on a web page and the inbound links from other sites to load times and how well a page is optimised for mobile devices.

So in this article – the first in a two-part series – I’ll walk you through the ways in which WordPress migrations can impact your SEO, and the pre-migration steps you should carry out to minimize a hit to your organic search rankings.

How Does Migrating WordPress Affect SEO?

#1: Changing URLs

Search engines view different URLs as different pages, even if the content is identical. If the appropriate steps aren’t taken for URLs that change, then their organic rankings and the value of inbound links could be lost.

#2: Changing Content

When determining the relevance of a web page to a user’s search query, content signals are some of the most important. If you are using the migration as an opportunity to refresh your WordPress website’s content, any changes could have a positive or negative effect on your organic rankings.

#3: Changing Code

Part of the WordPress migration might involve changing the theme used on your website. This will change how the site is coded and could affect how well optimised it is for search engines. There may also be unforeseen issues introduced if the developer didn’t have SEO as a priority whilst building the theme, such as crawling problems or poor mobile optimisation.

#4: Changing Performance

Page speed has been a ranking signal for many years now, and Google has recently been placing even greater emphasis on web page performance on mobile devices. You can see this with developments such as AMP (Accelerated Mobile Pages). Your WordPress migration could involve changing how the site is built, as well as its hosting. Both of these changes could affect your website’s load times.

The WordPress Migration Process for SEO

So, you can see that a WordPress website migration could have an impact on organic rankings and traffic. But how can you ensure the best possible migration for SEO?

You can divide the migration process into two phases: pre-migration and post-migration. This is a two-part series, with this particular post covering the SEO tasks you should carry out before the WordPress migration. The second post will cover what you need to do once the migration has been carried out, including testing, to ensure that everything has gone according to plan.

Getting Started: Pre-Migration for WordPress

Testing Your Server

If you are setting up the new site on a staging environment for testing, then you need to make sure that search engines cannot crawl the test site. The easiest way to do this is to block search engines using the robots.txt file.

This file should be placed in the root of your staging environment. For example, if your test site is on a subdomain such as staging.example.com, the robots.txt file would be located at http://staging.example.com/robots.txt. You need to have the following directive in your robots.txt file to block search engines from crawling the test site:
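
User-agent: *
Disallow: /

This blocks all well-behaved crawlers from requesting any page on the staging site. Just remember to remove it when the new site goes live.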

Setting up Google Search Console and Analytics

You should make sure that you have Google Search Console set up, as well as an analytics platform. A popular and free analytics solution is Google Analytics. If you don’t have analytics set up for your site, do this now. It can provide a lot of insight into your visitors’ behaviour, acquisition and conversions. You’ll need these tools for pre-migration tasks and post-migration checks.

Identifying Any Current SEO Issues

At this point, you also want to identify any SEO issues your current website might have. This step is particularly important if either the theme is being changed and/or the content is being refreshed. By flagging issues early, you can ensure that the new site build and content avoids these mistakes.

If you have the budget, then you can bring in an SEO specialist or an agency to help you identify issues. If not, then there are a number of SEO tools, such as Woorank (free trial) and Onpage.org (free for 100 URLs), that can help you to identify common SEO problems.

SEO Plugins Worth Checking Out

Is your current site using any plugins for SEO? If not, now is a good time to either start using one or at least plan which one to use on the new site after the WordPress migration. WordPress SEO plugins can help with a number of tasks, such as controlling which pages are indexed, XML sitemap generation, and editing metadata.

There are a number of SEO plugins available. Here at WPMU DEV, our SmartCrawl plugin is a great option for helping to drive traffic to your site. SmartCrawl features the ability to send sitemap updates to search engines, custom descriptions and titles, auto link keywords to any URL, built-in Moz integration and more.

Record Your Current SEO Performance

You’ll want to be able to see what did and didn’t work post-migration. To do this, you’ll need to know how your website performs before migrating. You can use both Google Search Console and your analytics platform to get a good idea of your organic search performance.

From Google Search Console, you will want to record the following metrics:

  • Search Analytics clicks
  • Search Analytics impressions
  • Search Analytics click through rate (CTR)
  • Search Analytics positions
  • Internal links

Search Console only stores Search Analytics data for 90 days, and I often find this data is something I want to refer back to in the future. If you want to compare data over many months, particularly at a granular keyword and landing page level, the last 90 days of data might not be enough. Err on the side of caution and download your pre-migration Search Analytics data: the “Queries” and “Pages” reports, broken down by the various combinations of devices and search types, for the last 90 days.

Search Console Search Analytics download.

You should also download the pre-migration internal links report. You can compare this with the post-migration data if you are having any issues and believe internal linking may be the problem.

If you are using an analytics platform, try to have it collecting data for at least a few months prior to the WordPress migration. This should provide you with a reasonable amount of data to compare pre-migration and post-migration.

Checking Your Website’s Page Speed

Since page speed is a ranking signal for Google, you will want to keep a record of how well your site is optimized pre-migration. You can use tools such as Google PageSpeed Insights or GTmetrix to get an idea of how fast your pages load and where improvements can be made. GTmetrix actually provides both YSlow and Google PageSpeed scores, as well as storing registered users’ most recent page analyses.

Or better yet, you can use our free WP Checkup tool here at WPMU DEV to get an overview of your SEO, site performance and security.

You should test a variety of pages, such as the homepage, single posts, category archives, tag archives and static pages. Make a note of how long it takes each page to load, the number of resources required, the total page size and the recommended actions needed to optimise page loads. You will want to refer back to this information during the post-migration phase.

WP Checkup is our free tool for testing your website’s SEO, performance and security.

Create XML Sitemaps for Your Site

Ensure that your website will have XML sitemaps post-migration. These are useful for getting your content crawled and indexed by Google. If you are unsure of how to do this, simply install the SmartCrawl plugin for automated sitemap creation and the ability to send sitemap updates to search engines.

If you aren’t currently using XML sitemaps, then it is recommended you implement them during the pre-migration stage. This will allow you to test them in Google Search Console and ensure they are working without errors.

You can read Google’s documentation on XML sitemaps for full details of the format.

Set up Redirects for URLs That Change

Depending on the size of your website, this can be one of the most time-consuming pre-migration tasks. It is also one of the most important. By correctly redirecting all of the URLs that are changing during the WordPress migration you will:

  • Alert search engines to your content moving to new URLs
  • Ensure visitors clicking on inbound links get to a live page rather than a 404 error page
  • Have the best chance of smoothly transitioning your organic search rankings from the old URLs to the new URLs
  • Retain the value of inbound links to the old URLs

To do this effectively you will want to create a redirect mapping document. Open up your favourite spreadsheet program and create one column for Old URLs and one for the New URLs you will be redirecting to. You now need to identify all of the URLs that you will want to redirect.

When redirecting URLs for a WordPress migration, don’t just focus on the URLs that are live on the current site. You should redirect any URLs that have inbound links, even if they are URLs from previous versions of the site or they are pages that have been removed prior to the WordPress migration.

There are a number of sources you can use, some free and some paid. You should use as many as you have access to. The sources I use to identify URLs to redirect are:

#1: Search Console

Export Search Analytics Landing Pages

If you followed the section earlier regarding recording current SEO performance, you should have already downloaded your landing pages from Search Analytics.

Note that Search Analytics will often limit the number of results to approximately 1000. If you have more pages than this, then you will need to use the Search Console API to get all of the URLs.
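
If you do need the API, here is a minimal sketch in Python using the google-api-python-client library. It assumes you have already obtained OAuth credentials for your account (not shown), and the property URL and dates are placeholders you would replace with your own:

from googleapiclient.discovery import build

def fetch_all_landing_pages(credentials, site_url="https://www.example.com/"):
    # Build a client for the Search Console (Webmasters v3) API
    service = build("webmasters", "v3", credentials=credentials)
    rows, start_row = [], 0
    while True:
        response = service.searchanalytics().query(
            siteUrl=site_url,
            body={
                "startDate": "2016-08-01",  # roughly the last 90 days
                "endDate": "2016-11-01",
                "dimensions": ["page"],
                "rowLimit": 5000,  # the API maximum per request
                "startRow": start_row,
            },
        ).execute()
        batch = response.get("rows", [])
        rows.extend(batch)
        if len(batch) < 5000:  # last page of results reached
            return rows
        start_row += 5000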

Export Most Linked Content

  1. Click Search Traffic
  2. Click Links to Your Site
  3. Under Your most linked content, click on More »
  4. Export your most linked content by clicking Download this table
Search Console most linked pages download.

Export Internal Links

  1. Click Search Traffic
  2. Click Internal Links
  3. Export your internal link pages by clicking Download this table
Search Console Internal Links download.

Export Crawl Errors

  1. Click Crawl
  2. Export your crawl errors by clicking Download
Search Console Crawl Errors download.

#2: Google Analytics

Export Landing Pages

  1. Click Behavior
  2. Click Site Content
  3. Click Landing Pages
  4. Set the start and end dates to cover the entire time that you have collected data
  5. Scroll to the bottom of the page and set Show rows to 5000
  6. Scroll to the top of the page and click on ‘Export’ and then select the file format you want

Note that if you have more than 5000 landing pages, you will need to paginate through each batch of 5000 and export each set of results.

Google Analytics Landing Pages export.

#3: Inbound Link Monitoring Tools

I generally use Ahrefs and/or Majestic for monitoring inbound link data; however, there are a wide range of tools out there for you to use.

Export Landing Pages From Ahrefs

  1. Search for your domain
  2. Under Pages in the left-hand menu, click Best by links
  3. At the top right of the page click Export
  4. Select Full Export
  5. Click Start Export
  6. The export will be available to download once it has finished processing
Ahrefs pages export.

Export Landing Pages From Majestic

  1. Select Historic Index
  2. Search for your domain
  3. Click Backlinks
  4. Click Export Data
  5. Click Advanced Report
  6. On the next page ensure Advanced Report and Domain are selected then click on Create Report
  7. On your Reports page, click on the domain you just created this report for
  8. Hover over Download Options
  9. Click on Download Backlinks
  10. Click on Prepare Download
  11. Navigate to your Downloads page
  12. Click on the link Backlinks for {yourdomain.com}
  13. The data you will want to use in the export is the Target URL column
Majestic SEO export data.

#4: Screaming Frog

Screaming Frog is one of the most popular web crawling tools. There is a free version available that allows you to crawl up to 500 URLs. The premium version allows you to crawl an unlimited number of pages.

Run a Site Crawl

  1. Open Screaming Frog
  2. Click Mode and select Spider
  3. Type your website’s URL into the input box at the top of the window and click Start
  4. Let the program run whilst it crawls your website
  5. Once it has finished, within the Internal tab, click Export

Once you have all of this data, take all of the URLs you have collected for your domain and place them into the Old URLs column in your redirect mapping spreadsheet. De-dupe these URLs and then you are ready to begin mapping the new URLs to the old ones. Make sure you redirect each URL to either the URL that directly replaces it or, failing that, the most relevant content that remains on the site.
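
Since every source exports its data in a slightly different layout, a small script can help combine and de-dupe the lists. Here is a minimal sketch in Python; the filenames are placeholders, and it assumes each export has been cut down to a single column of URLs:

import csv

# Placeholder filenames for the exports gathered above
sources = [
    "search_console_pages.csv",
    "analytics_landing_pages.csv",
    "ahrefs_best_by_links.csv",
    "majestic_target_urls.csv",
    "screaming_frog_internal.csv",
]

old_urls = set()
for path in sources:
    with open(path, newline="") as f:
        for row in csv.reader(f):
            # Keep only cells that look like URLs, de-duping as we go
            if row and row[0].startswith("http"):
                old_urls.add(row[0].strip())

with open("old_urls_deduped.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for url in sorted(old_urls):
        writer.writerow([url])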

When the redirect mapping is complete, you will need to consider how to implement the redirects. They can either be added manually to the .htaccess file or managed with a redirect plugin.

If you want to use the .htaccess file, there is a lot of useful information over on the Apache website. Generally you will be using the Redirect and RedirectMatch directives; for more complicated matching, you will want Apache’s mod_rewrite.
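
As a minimal sketch (the paths and domain are placeholders), the two directives look like this:

# Permanently (301) redirect a single moved page to its new URL
Redirect 301 /old-page/ https://www.example.com/new-page/

# Redirect a whole section by pattern, e.g. a renamed blog directory
RedirectMatch 301 ^/blog/(.*)$ https://www.example.com/news/$1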

A popular redirect plugin for WordPress is Redirection. This allows you to easily add redirects and provides a considerable amount of flexibility and control. You can read the documentation for this over on the official site.

WordPress Redirection plugin.

You do not want these redirects to go live yet, but you do want to have them ready for when you migrate your website. You can have the new version of your .htaccess file with the redirects ready to replace the pre-migration .htaccess file. If you are using the Redirection plugin, then you can add your redirects via the plugin and just leave them set as disabled until it comes time to migrate your site.

Please note, you should only be redirecting URLs that are changing. If you try to redirect URLs that remain the same, you will cause redirect loops. This will make the page unreachable until the redirect is removed.

Wrapping Up

That concludes part one of this WordPress migration SEO guide. At this point, your site should be ready to migrate, at least in terms of SEO.

In part two, I’ll be covering what you need to do immediately after the WordPress migration, how you can check if everything went correctly, and the ongoing analysis you can do to make sure your WordPress migration hasn’t negatively affected your site’s SEO.

via The WordPress Experts – WPMU.org

How to Find Epic Keyword Opportunities That Turn Into Easy SERP Wins by @josephhhoward

“The best way to improve your online visibility is to write great content!”

We’ve all heard this before. In fact, we hear it over, and over, and over again.

Personally, I can’t stand when people say it.

Why? Because in my opinion, it misses the point entirely!

Are there advantages to writing great content? Of course.

  • Better conversion. The more engaged your readers are, the more likely they are to give up their email address, like your page on Facebook, or even buy your products or services.
  • Better time-on-page and lower bounce rate. The longer your visitors interact with your content, the better signals are sent to Google about how valuable your writing is to readers.
  • Build more relationships in your industry. When you write something terrific, people take notice. This could help you develop partnerships or interact with people who could help you grow your online business.

But when people tell you that great content is the be-all, end-all to ranking well in search engines, it bugs me!

  • Does that mean you can’t rank well if you’re not a world-class writer? The necessity to produce “great content” suggests that if you’re not one of the best writers in your industry or you don’t have the money to pay one to write for your website, you can’t compete. This couldn’t be further from the truth.
  • It’s about providing value, not “greatness.” People use Google to find a resource that answers their query. The focus of Google since its inception has been to answer searcher intent. That should be your focus too, and if a brilliant piece of writing comes from it, so be it.
  • The chase for greatness ignores low-hanging fruit. If you have a new website and want to write some “great content” to try to outrank a 5000-word guide on a DA 91 site, go right ahead. Chances are you’ll be up the river without a paddle. The trick is to find keyword opportunities you can actually compete for and write content that beats that of your pedestrian competition.

While writing terrific content has its advantages, it’s not the only way to win when it comes to gaining visibility in search engine results. Finding the right keyword opportunities means you only have to write content better than your average competitor, not the entire industry.

My company WP Buffs is a Domain Authority (DA) 19 site. Not very imposing.

Yet we’ve managed to rank for some low- and medium-competition long-tail keyword phrases. In this case, we managed to rank better than WordPress.org (DA 100) by optimizing for more niche search phrases.

Here’s the exact blueprint we use to find keyword opportunities that will allow us to increase our visibility in search results and win at Google.

1. Join All the Email Newsletters

First off, you’ll need to do some information gathering in your industry.

  • Do a quick Google Search for “[your industry] newsletter” and sign up for as many as you come across. It’s important to sign up not only for the ones that seem high-quality, but for ones that are a little rougher around the edges. Remember — we’re looking for keyword opportunities, which means most will come from smaller or less refined websites.
  • Set up Google Alerts for your industry. Because Google indexes every website and webpage across the internet (unless the website owner tells it not to), this will allow you to capture every new article in your industry regardless of whether or not they send out a newsletter.
  • Sign up for Unroll.me to combine all your newsletters and Google alerts into one daily email. This will help you prioritize the important items in your inbox and review content when you decide it’s time.

We’re signed up for every WordPress newsletter on the web. We get tons of emails every day about what our SERP competitors are writing about, and it all arrives in one tidy email every morning.

2. Use SEMrush to Find Long-Tail Keyword Searches

When reviewing newsletters or Google alerts, take every article you find and submit it to SEMrush. It’s great for competitor analysis and keyword research.

When looking over what our WordPress competitors were writing about, we found these two articles:

  • Better Media Management With WordPress Real Media Library
  • Customizing the WordPress Media Library

We took these articles, along with every article our team read, and used SEMrush to do a bit of competitor research.

There are a few ways we can find keyword opportunities from these articles:

  • Under Domain Analytics > Organic Search > URL, you can find out exactly what keywords this article is ranking for.
  • Under Keyword Analytics > Related Keywords, you can search for related keyword phrases to those in the article title. In this case, we did a search for “WordPress media library.”

Remember — what you’re looking for are long-tail keyword phrases, or searches that contain a string of words instead of just one or two.

By using the related search feature in SEMrush, we found the keyword phrase “WordPress digital asset management” along with a few others.

Note: The free version of SEMrush is terrific for base-level research, but if you want to dive even deeper, you may have to sign up for their introductory plan. They offer a seven-day money-back guarantee, so feel free to sign up, play around with it for a week and decide if it’s something you want to use long-term!

3. Do Some Manual Research

Now that you’ve put together a solid list of long-tail keywords, it’s time to see where the best opportunities are for you. This entire strategy will only work if we can find real opportunities in search results to focus a piece of content on.

Before we dive into search results, get the Mozbar Google Chrome plugin. This will allow us to see DA, Page Authority (PA), and links back to a website directly from Google search results! You’ll also need to create a free Moz account.

Activate Mozbar and start manually searching Google using the long-tail search phrases you found using SEMrush.

  • Look for searches in which the top search result has a low DA. This is the easiest way to spot a search phrase that gets significant traffic according to SEMrush and provides a good opportunity for you to write content that takes over that #1 spot. A website with DA <40 is usually a great opportunity regardless of the DA of your own site.

  • Always remember searcher intent. If the top three websites have DA 100 but none of them answer the searcher’s intent, that’s also a good opportunity. You may end up ranking fourth, but you’ll get more people clicking through to your website than the three above since you answer their question and they don’t! That’s what we found with the search “WordPress digital asset management.” People would be looking for a more in-depth guide on the topic, not WordPress support.

  • Click through to websites to see what the content looks like. Often, you’ll find that the content that’s ranking is extremely thin and not helpful to searchers at all. That means an opportunity for you! The top-ranking page underneath the WordPress pages for the search “WordPress digital asset management” had 259 words. A 2000-word guide would trounce that!

You can also find long-tail keyword phrases by allowing Google to give you suggestions.

  • Use Google’s autocomplete suggestions to add to your list of long-tail phrases. Google makes these suggestions based on the searches that get the most traffic, so they’re worth considering.

  • Scroll to find more Google suggestions. Often, Google will include a list of related search phrases at the bottom of every page of search results.

4. Writing Tips

The entire point of finding epic keyword opportunities is so you can write something that’s better than an average piece of content but doesn’t necessarily have to be one of the best in your industry. This means that even if you’re not a world-class writer, you can compete in search results by doing a bit of keyword research beforehand.

That being said, putting together a solid article to help your piece of content rank well can still be challenging. Here are a few tips:

  • Write something long. Google likes long-form content that goes deep to answer a question. Spend three hours writing one 1500-word article instead of three 500-word articles. You’d rather have one piece of content ranking in the top few positions than three articles ranking on the second or third page.
  • Focus on how-to guides. These are the easiest articles to write and show your expertise in the industry. People are more likely to buy from you if they know you solve problems when you encounter them!
  • Focus on evergreen content. This means stay away from news and what’s hot right now. Stick to content that will be useful for searches in the next few years, not just the next few weeks.
  • Make sure your on-page content is fully optimized. That means page title, H1 header, image alt tags, social media tags, etc.
  • Getting backlinks will be helpful, but it’s not always necessary in this case. In this situation, we’re trying to outrank content that’s very average and doesn’t have any high-quality links. Writing long-form content and making sure your on-site optimization is on point should be enough.

Now it’s your turn! This strategy is useful regardless of how competitive your industry is or how authoritative your website is. We’re in the WordPress space, which has a lot of major players, and we’re still able to compete in search results and hold our own. Give it a shot and stick with it; over time, you’ll start to see your organic traffic trend upwards.

 

This is my first contribution to SEJ, so if you’ve read this far, I would really appreciate some feedback! Please leave your thoughts in the comments below and let me know if this strategy worked for you too!

Image Credits:
Featured Image: Maarten van den Heuvel/Unsplash.com
In-post Photo: stevepb/Pixabay.com
Screenshots by Joe Howard. Taken November, 2016.

 

via Search Engine Journal

An update on Google’s feature-phone crawling & indexing

Limited mobile devices, "feature-phones", require a special form of markup or a transcoder for web content. Most websites don’t provide feature-phone-compatible content in WAP/WML any more. Given these developments, we’ve made changes in how we crawl feature-phone content (note: these changes don’t affect smartphone content):

1. We’ve retired the feature-phone Googlebot

We won’t be using the feature-phone user-agents for crawling for search going forward.

2. Use "handheld" link annotations for dynamic serving of feature-phone content.

Some sites provide content for feature-phones through dynamic serving, based on the user’s user-agent. To understand this configuration, make sure your desktop and smartphone pages have a self-referential alternate URL link for handheld (feature-phone) devices:

<link rel="alternate" media="handheld" href="[current page URL]" />

This is a change from our previous guidance of only using the "vary: user-agent" HTTP header. We’ve updated our documentation on making feature-phone pages accordingly. We hope adding this link element is possible on your side, and thank you for your help in this regard. We’ll continue to show feature-phone URLs in search when we can recognize them, and when they’re appropriate for users.

3. We’re retiring feature-phone tools in Search Console

Without the feature-phone Googlebot, special sitemaps extensions for feature-phone, the Fetch as Google feature-phone options, and feature-phone crawl errors are no longer needed. We continue to support sitemaps and other sitemaps extensions (such as for videos or Google News), as well as the other Fetch as Google options in Search Console.

We’ve worked to make these changes as minimal as possible. Most websites don’t serve feature-phone content, and wouldn’t be affected. If your site has been providing feature-phone content, we thank you for your help in bringing the Internet to feature-phone users worldwide!

For any questions, feel free to drop by our Webmaster Help Forums!

Posted by John Mueller, Webmaster Trends Analyst, Google Switzerland

via Google Webmaster Central Blog

5 Types of Google Penalties (And What You Need to Do to Recover) by @IAmAaronAgius

When webmasters fear getting Google penalties, most of them think of the dreaded algorithm updates: Penguin, Panda, the Top Heavy algorithm, etc. Get caught in one of these algorithm sweeps, and you could lose some or even all of your organic search traffic.

But Google actually has a lot more in its arsenal than just algorithms to encourage you to follow their Webmaster Guidelines. Ever heard of a Manual Action Penalty?

What’s a Manual Action Penalty?

Google has teams of search engineers tasked with the job of reviewing individual websites and, if necessary, assigning a rank penalty.

When Google runs Penguin, sites across the web can take a rank hit. A Manual Action Penalty means your site alone has been hit, and it’s your problem to fix.

Manual Actions are the most common type of Google penalty you can get. If you have one, you should see a message in Search Console about it.

You can go to Search Console and check for yourself right now. Just go to Search Traffic > Manual Actions and see if you have a message:

If Google has penalized your site with a manual action, you’ll see a message here describing the penalty. You can either get a Site-wide Match (meaning your whole site is affected), or a Partial Match, meaning only certain pages of your site are penalized.

Google Search Console Help provides a list of 12 common manual actions you can receive (meaning that there are more):

  • Hacked site
  • User-generated spam
  • Spammy freehosts
  • Spammy structured markup
  • Unnatural links to your site
  • Thin content with little or no added value
  • Cloaking and/or sneaky redirects
  • Cloaking: First Click Free violation
  • Unnatural links from your site
  • Pure spam
  • Cloaked images
  • Hidden text and/or keyword stuffing

Some of them, like the Pure spam or Spammy freehosts penalty, aren’t likely to happen to your average webmaster (unless you own a blatant spam site or host a lot of them).

A lot of webmasters could be at risk for the other manual actions, though.

The good news is you can fix the problems Google’s engineers found on your site, and then request a review of the Manual Action in Search Console:

Google’s engineers will review it, and if they approve it, they’ll remove the penalty and allow your pages to start gaining rank again.

Here are five common types of penalties my clients have gotten, and a walk-through of how we’ve helped their sites recover from each.

1. Unnatural Links

There are two different kinds of “Unnatural Links” penalties Google has:

  1. Unnatural Links from Your Site: You are hosting unnatural, artificial, or deceptive outbound links on your site.
  2. Unnatural Links to Your Site: You have unnatural, artificial, or deceptive backlinks pointed at your site.

These manual actions are in line with Google’s Penguin update, meant to penalize people who are participating in link exchanges, buying links, or selling them to manipulate rank.

If you have an unnatural links penalty, here are some examples of the kind of links you need to fix:

  • Paid links
  • Links acquired from link exchanges
  • Spammy guest post links
  • Automatically generated links
  • Spammy forum comment links
  • Irrelevant backlinks
  • Links from low-quality directory or bookmark sites

Clean up the Links From Your Site (Outbound)

Use a link analysis tool like Ahrefs or Majestic to get a list of your outbound links. SEOChat’s Website Crawler is another free option that will analyze 100 pages for you without registering.

Download a list of external links from the report:

Identify any links on your site that are against Webmaster Guidelines. Once you find them, you can either:

  • Remove the links
  • Route the links through an intermediate page that is blocked by robots.txt
  • Set the links to “nofollow” (see the example below)
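
For instance, a link set to “nofollow” (with a placeholder URL) looks like this:

<a href="http://example.com/partner-page/" rel="nofollow">Partner site</a>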

Clean up the Links to Your Site (Inbound) 

You can get a list of links pointed at your site using your backlink analyzer of choice, or you can use Search Console. Just click “Search Traffic,” then “Links to Your Site,” and you can download a list.

Find any backlinks that are against Webmaster Guidelines.

Next, you’ll need to send out take-down request emails to the webmasters hosting them. If they don’t respond to you, then as a last resort, use Google’s Disavow Tool to tag the links so they don’t pass PageRank.
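
The disavow file itself is just a plain text file you upload through the tool. A minimal sketch, with placeholder domains, looks like this:

# Webmasters did not respond to removal requests sent 2016-11-01
domain:spammydirectory.example
http://linkfarm.example/widgets/page1.html

Lines starting with # are comments, the domain: prefix disavows an entire domain, and bare URLs disavow individual pages.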

Once you’ve cleaned up your links, you can move on to submit a reconsideration request.

2. User-Generated Spam

If you’ve gotten a User-Generated Spam penalty, it doesn’t mean you’re a spammer – but your site users are. As far as Google’s concerned, it’s up to you to clean up spammy content people post to your:

  • Forum pages
  • Guest book pages
  • Blog post comments
  • Other site areas

Mozilla famously got penalized by this Manual Action a while back. Here are some user-generated spam examples from their site:

If you haven’t already, the first thing you’ll want to do is install some kind of anti-spam software. Akismet is a popular WordPress tool that will detect and filter out some of your spam comments.

Hopefully, this does most of the cleanup work for you, but don’t stop there. You need to manually go through and remove any spam that got through the filters.

Look out for things like:

  • Posts that are blatant advertisements
  • Posts with gibberish text
  • Posts with off-topic links (Probably the most common type of comment spam I see)
  • Commercial-sounding content (Think payday loans, discount insurance, libido enhancers, etc.)
  • Auto-generated comments

You should also vet your user profiles, and delete any that might be spam accounts. These are usually auto-generated, have no profile photo, no description, and of course, post a lot of irrelevant comments.

A User-Generated Spam penalty is one you’re likely to get again and again, unless you take a hard line on spam from now on. To help my clients who have had this penalty prevent spam in the future, we:

  • Use a CAPTCHA on their sites
  • Change all forum links to “nofollow”
  • Allow users to report spam
  • Consider moderating all comments

If you have a User-Generated Spam penalty, chances are you have a lot of comments and user accounts to go through. Neil Patel’s Quicksprout got this penalty several times and was faced with nearly 350,000 forum users to sift through.

But it’s worth it to be as thorough as possible, because if Google sees there’s still spam on your site, they’ll reject your reconsideration request.

In my experience, you have three options:

  1. Take the time to go through all your user-generated content yourself.
  2. Hire someone else to do it for you.
  3. Delete all your user-generated content.

In the end, number three is what Neil Patel did. The decision is up to you – if user-generated content is central to your site, it might be worth it to clean everything up.

3. Hacked Site

Some of my clients followed Google’s Webmaster Guidelines to a T, but still ended up with a manual action penalty because their site was hacked.

Hacked sites pose a threat to you and your site users, so Google wants you to clean things up.

Keep in mind that if hackers are doing something malicious on your site, you might not even get a “Hacked Site” manual action. I’ve handled several cases where unknowing webmasters have ended up with a “Pure Spam” manual action instead.

Fixing a hack can be a big undertaking, but Google has some helpful resources for what to do.

Here are the basic steps:

Quarantine your site

You don’t know what’s happened to your site or if the problem is spreading.

So the first thing you want to do is take your site offline. You can do this by either stopping your web server or setting up a 503 response code.
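
If you leave the web server running, a 503 tells search engines the outage is temporary so they will come back later. A minimal .htaccess sketch for Apache 2.4 (the IP address is a placeholder for your own, so you can still reach the site) might look like this:

# Answer 503 Service Unavailable to everyone except your own IP
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
RewriteRule ^ - [R=503,L]
ErrorDocument 503 "Site temporarily offline for maintenance"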

You’ll also want to change all the passwords for your site, including system administrators, content management systems, FTP logins, and any other access points.

Identify the type of hack

Next, you need to figure out what kind of hack you have.

Search Console may have left you a message about it to go with your Manual Action Penalty. If not, move on to check the “Security Issues” section of Search Console.

Here are some of the ways you can be hacked:

  • Spam – Someone’s adding spammy pages, links, or text to your site
  • Malware – Someone’s installed software to damage computers
  • Phishing – Someone’s installed software to collect information about your site users

Eliminate the vulnerability

Before you fix any changes hackers made to your site, you need to figure out how they accessed your site in the first place. If you don’t close this hole, they can continue to damage your site.

The problem could be a lot of things, like:

  • A virus-infected computer
  • Weak passwords
  • Out-of-date software
  • Permissive coding practices

If you aren’t comfortable investigating these possibilities yourself, bring a professional in to do it for you.

Clean up the hack

This is another job you’ll probably want a professional to do. They can remove the malware by hand and help you clean up your servers.

If the hacker got access to confidential user information on your site, you’ll have some legal responsibilities as well. Here’s a helpful resource on what to do in that case.

4. Cloaking and/or Sneaky Redirects

If you’ve gotten a manual action penalty for cloaking or sneaky redirects, one of these things happened:

  • Cloaking: You’re showing some content to Google but not to site visitors (either images or text)
  • Sneaky redirects: Your pages indexed in Google redirect users to completely different content

Here’s what I do with clients to help them recover from both.

Check for cloaking

To figure out what the problem is, use Fetch as Google. This tool will show you how Google sees your site.

Take the root address of affected pages from your Manual Actions report, and plug them in:

Compare Google’s rendering of your page to how it appears in your browser. If there are any differences, fix them.

Most cloaking is deliberate, but if you aren’t sure why your pages look different, talk to your web developer, SEO agency, and anyone else who has access to your HTML to diagnose the problem.

Repeat the process, rendering different versions of your site pages, including mobile.

Check your redirects

Next, check your redirects using a tool like Screaming Frog. Their report has a “Redirect URI” column so you can analyze each URL destination.

Look for any URLs on your site that redirect somewhere that site visitors probably didn’t want to go. Change these redirects to more relevant pages, or remove the redirect entirely.

Check for deceptive buttons, ads, and plugins

If you use an anti-hotlinking plugin to protect your images and bandwidth, Google could see it as cloaking. You may need to remove the plugin or disable the anti-hotlinking feature.

Also look out for any advertisements on your site that could trick people into clicking on something they wouldn’t have otherwise. These often look like trusted entities but are actually ads:

Any of these things could be responsible for the manual action, so make sure you clean up as much as possible to make Google happy with your site.

5. Thin Content

Google wants to deliver a variety of quality options in search results. If your site is full of shallow, duplicate content, you could get a manual action to keep your pages low in rank.

Here’s what Google means by “thin content,” so you can evaluate your own pages:

Duplicate content from other sites

If you’ve taken content from another site and republished it on yours, this can be considered thin content.

Some webmasters scrape content from somewhere (like Wikipedia), make minor changes, and republish. If you sell products and copy a manufacturer’s product descriptions, that can also count.

Thin content with affiliate links

If you have an affiliate site and publish content with the sole purpose of hosting affiliate links, you could get a thin content penalty.

Google wants to see that your content offers more value than what the original affiliate can provide. If your site is full of product descriptions and reviews from the merchant, you need to seriously improve your content.

Duplicate content on your site

Hosting a lot of identical or very similar pages on your own site can also land you a thin content penalty. I’ve seen SEOs come across this problem when they use doorway pages targeting different regions:

Auto-generated content

Google sees auto-generated content as thin. This can include auto-translated text, automatically spun content, and text generated from scraping RSS feeds.

Once we’ve found all the potential thin content on my clients’ sites, here are the three options I give them for moving forward:

  1. Delete it. If you scraped someone else’s text or use auto-generated content, this is what you’ll need to do.
  2. Update it so it’s better quality. That includes rewriting product descriptions, adding substance to affiliate pages, merging doorway pages, and deleting unnecessary duplicate content.
  3. Remove it from search by adding noindex meta tags. If you have to keep some of your thin content pages, put this meta tag in the <head> section of those pages: <meta name="robots" content="noindex">. Or you can use a tool like SEO by Yoast to do this for you.

After you’ve done your best to clean up your site, you can submit a reconsideration request.

Submitting Your Reconsideration Request

Once we’ve done everything we can to clean up a client’s site and fix whatever caused Google to give them a Manual Action in the first place, it’s time to submit a reconsideration request.

To do this yourself, go back to Search Console and click “Request a Review.”

Then you’ll see a box where you can submit your request.

When my clients reach this step, we try to be as specific as possible, including all relevant information about the cleanup process. Sometimes it’s easier to detail it in a Google Doc or Sheets file, then add that to the review request.

Relevant information for your review request might include:

  • A list of bad links you removed from your site
  • A list of spam comments you deleted
  • Details of your malware cleanup

Also be sure to explain how you plan to prevent the same problem in the future.

After you submit, you should get a confirmation from Google. Then hopefully, within a few weeks, you’ll receive communication that the Manual Action has been removed.

If you didn’t do a good enough job and your site’s still violating Webmaster Guidelines, Google will tell you to go back and try again.

The Big Picture

If you get a Manual Action Penalty from Google, it’s not the end of the world. Google created this system to give webmasters an opportunity to clean up their sites and be in line with Webmaster Guidelines.

It could be a lot worse – I know a lot of sites that lost rank from algorithm changes and never fully recovered, no matter what they did to fix the problem.

Just follow the steps in this post to overcome your manual action, and pay close attention to Google’s Webmaster Guidelines overall to avoid another problem in the future.

Have you had a Manual Action Penalty before? Tell me how you recovered in the comments:

 

Image Credits

Featured Image: Pixabay

Screenshots by Aaron Agius. Taken November 2016

via Search Engine Journal

Mobile First Index and 7 Things You Need to Stop Doing Immediately

Mobile First Index – Why is Google changing their index after so many years?

Google’s “desktop first” index has been around since the very beginning of the search engine. Why is it now being abandoned and pushed aside as a backup? What is the mobile first index, and will the other search engines follow?

The fact that mobile searches have topped the number of searches from desktop devices in many countries around the world is nothing new. Google has been trying to make marketers, developers and business owners pay more attention to users consuming content on their phones ever since announcing ‘Mobilegeddon’ in February 2015, an update that favours mobile-friendly websites in the rankings.

Some website owners took the hint and invested in developing mobile-optimised and responsive websites, while others created “mobile websites” which would be served to users visiting on a mobile device through a redirect. The latter solution often led to less than ideal UX, as the content would often be different from that on the original desktop site.

As a result, users who click on a link from search results expecting certain text from the search snippet will be disappointed when they don’t find what they were looking for.

That’s why Google has decided to change its point of view and start gauging the websites in its index primarily with mobile users’ interests in mind. In technical terms, this means that the main user agent crawling the website will be a mobile one, and the content accessible to the mobile crawler is what will be considered for ranking.

Timeline of Google’s mobile-targeted actions

  • 26/02/2015 – Google announces change to mobile search results (‘Mobilegeddon’)
  • 21/04/2015 – Google rolls out Mobilegeddon
  • 05/05/2015 – More searches on mobile than on desktop in many countries including USA and UK
  • 01/09/2015 – Google warns to not use mobile interstitials / APP banners
  • 07/10/2015 – Google launches the AMP Project
  • 04/11/2016 – Mobile first index announced
  • 10/01/2017 – Google starts penalising mobile sites with interstitials

What do all these changes mean for us? If you are using a mobile-friendly responsive website, probably not much. But even if your website is mobile-optimised, you can do more to get the most out of your SEO strategy. Here are seven things you should definitely stop doing immediately:

1. Stop redirecting based on the device

Having a separate mobile website is a legitimate strategy for many reasons, for example if your mobile visitor persona is different from your desktop one. However, in combination with serving it through internal redirects based on the viewing device, it can cause a series of problems. To start with, it’s much harder for the mobile first googlebot to discover the desktop version of the site.

The better way to do this is to use rel="alternate" tags and canonicals to map the desktop and mobile versions, as Google says that “we’ll continue to use these links as guides to serve the appropriate results to a user searching on desktop or mobile.”
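
As a sketch, with www.example.com and m.example.com as placeholder hosts, the desktop page points to its mobile equivalent and the mobile page points back with a canonical:

<!-- On the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">

<!-- On the mobile page -->
<link rel="canonical" href="http://www.example.com/page">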

2. Stop using different content on desktop vs. mobile website

It should be an absolute no-brainer that your desktop and mobile version of the same page should show the user the same content. It can be sized differently or lack some visual effects, but it’s important that all the content is the same. Using sneaky redirects for mobile users to show them different content from what the desktop users would see is one of the reasons why Google introduced the mobile first index in the first place.

3. Stop underestimating on page targeting

Mobile screens offer much smaller space to display our content compared to desktops, even if our smartphones and phablets are getting bigger and bigger. This means that we need to make the most of the available space in a search results page. Mentions of terms in META data, headings, copy and structured data play an even bigger role in mobile search.

4. Stop ignoring structured data

As mentioned above, structured data markup can help make your search results look better and more engaging, and thus increase the probability that users will click on your link. Google is constantly increasing the number of supported cases for Schema markup and rich cards, so even if you have ticked this item off your SEO to-do list, it’s always good to keep your eye on new opportunities in micro-formats. There are generator tools that make it easy to produce the most common structured data formats.
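
For example, a minimal JSON-LD block for a local business (all values are placeholders) can be dropped into a page’s HTML:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Cafe",
  "telephone": "+44-20-0000-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "London",
    "postalCode": "N1 1AA"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
</script>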

5. Start paying attention to local SEO

Mobile searches very often have local intent – we search for restaurants or stores around us. Whether you are a local business or just have locally-relevant content, it’s important you spend time making sure your page is optimised for local SEO.

6. Stop being oblivious to your site speed

Site speed can influence your rankings for both desktop and mobile searches, but page loading speed is a much more sensitive issue on mobile phones, as connection speeds tend to be slower. Regularly checking how long it takes to load and render your website goes a long way. Make sure you:

  • Avoid using unnecessarily big images
  • Leverage browser caching (see the sketch after this list)
  • Deliver your content from a cookie-less domain
  • Move your JavaScript to the bottom of the page
  • Eliminate render blocking CSS and JavaScript from above the fold
  • Minify your CSS, HTML and JavaScript
  • Consider switching to newer HTTP/2 protocol
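
For the browser caching item above, a minimal Apache sketch using mod_expires (the lifetimes are illustrative) looks like this:

<IfModule mod_expires.c>
  # Cache static assets in the browser for a sensible period
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>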

7. Stop failing to verify your mobile site on Google Search Console

If you are using a separate mobile website, you should consider creating a separate property in your Google Search Console account. This gives you great insight into how googlebot sees your website, letting you spot and fix any potential issues that may affect the way you rank on Google.

Whatever you do, as long as you make sure that your website offers relevant content in a speedy, non-laggy way – and you let Google know about it – there is no need to be worried. That’s why most of the recommendations above are focused on the experience of your mobile users. After all, they decide which link they will tap on, and that’s what the mobile first index is about: letting them pick from the best.

Post from Pete Campbell

via State of Search

Yelp Users Can Now Check-in With Yelfies by @DannyNMIGoodwin

Yelp has launched three new features that let users check in with a photo, change the way business images are displayed, and let users share bookmark collections.

Here’s what you need to know about the three new Yelp features rolling out now for iPhone and Android.

1. Check-in With a ‘Yelfie’

What do you get when you combine Yelp with selfies? Apparently you get something called “Yelfies.”

Yelp announced that when your customers check in at your location, they can take a photo of your business. Once they’ve done that, the camera will flip so they can add a “yelfie,” along with your business name and a rating.

Yelp heralds “yelfies” as a “fun way to capture your experiences with local businesses in-the-moment.” And Yelp has a 23-second explainer video to break down how “yelfies” work, just in case the concept is a bit too tough for you to grasp.

2. AI Helps Yelp Show Better Business Images

With 100,000 photos now being uploaded to Yelp every day, Yelp’s photo team is turning to AI and machine learning to help identify the highest-quality photos to appear on your business page.

“Last year, we started by training a neural network to categorize photos,” according to Yelp. “Over the past year, we’ve done extensive evaluation and analysis to improve the quality of the photos shown at the top of each business page.”

Here are a couple before-and-after examples Yelp shared:

Yelp images before vs after

You can check out the technical details in Yelp’s write-up of the project.

3. Yelp Makes Bookmark Collections Shareable

In October, Yelp introduced bookmark collections. These let users curate and organize lists of businesses on Yelp.

Now Yelp has announced that users can share these bookmark collections with other users by tapping the “share” icon. Bookmark collections can be shared either as a link or a post on social media.

Users simply need to be given the link to follow a bookmark collection, and that list then becomes part of their own collections.

What do you think of Yelp’s new features?

Image Credits: Yelp

via Search Engine Journal