How to Recover from a Google Algorithmic Penalty Fast Using TRAP

On August 1st, 2018, Google launched one of the most aggressive algorithm changes since the notorious 2013 Google update, “Penguin”.

The August update came like any other Google update: no warning and lots of speculation. Overnight, some of the most affected niches (finance and health) saw up to a 50% loss in traffic and rankings.

Many affiliate marketers who rely on highly competitive organic rankings on a national level had to scrap their websites and start all over again.

This is the 3rd major algorithm update I’ve been through at Fannit. One of the more frustrating things about this update was the fact that both white hat and black hat SEOs saw their websites affected (or not affected) in the same ways. How frustrating!

To make matters more complicated, Google announced the update and responded to webmasters with language similar to that of their infamous Penguin update.

In summary, they said there’s really nothing webmasters can do about the update, and if your site has been devalued, just try to do better next time. Which is to basically say, “You suck! You do better next time!”

And worse, they didn’t even spell benefitting correctly!

Plainly, SEOs lost their shizzle.

Tip: Top SEO Predictions for This Year

Who were the biggest losers?

The broad core algorithm update definitely seemed to devalue some specific niches more than others. The data shows that the most affected sites fall within the category of Your Money or Your Life (YMYL) websites, and that how well their expertise, authority, and trustworthiness (EAT) scored played a major role.

What is EAT?

Expertise, Authority, and Trustworthiness are the standards by which Google judges whether a website is worthy of being ranked for a topic.

Basically, high quality pages get the rankings while low quality pages don’t. Here are some of the factors that can hurt your EAT:

  1. Low quality content - not written by an expert.
  2. Duplicate content - copied from another expert’s website.
  3. Short content - length can also indicate to Google that an article is not exhaustive enough to solve the reader’s problem.

What is YMYL?

Your Money or Your Life pages are defined by Google as pages whose content can significantly impact a person’s finances, health, safety, or wellbeing. Here’s how Google breaks this down:

  1. Pages that focus on financial transactions or shopping
  2. Sites that provide sensitive financial information, like taxes or investments
  3. Articles that provide information on medical conditions or diseases & how to assess and treat them properly.
  4. Legal themed pages that offer advice on personal injury, family law, or other sensitive legal issues.
  5. Pages that offer advice on home or car repair that could potentially be disastrous to the quality of life of a person if the incorrect information is given.

I’ve taken this graphic from Search Engine Journal here, which effectively depicts the top niches affected.

This graphic shows the ranking changes before and during the Google update. The creator of this graphic, Mordy Oberstein from RankRanger, believes the data shows ranking movement on a scale not seen in any previous Google update.

SEMrush Sensor was also off the charts, reporting a volatility score of 9.4 on the day of the update. Some of the most affected niches were health, finance, auto, and fitness.

What can you do?

Well, Google has one piece of advice... “Do better next time!”

Wouldn’t it be nice if massive world problems could be solved with advice like this?

I’d receive the parent of the year award if I gave this advice to my kids every time they failed: “Son, just do better next time!” Nailed it!

Seriously, how do we turn such nebulous advice into practical action? Well, let’s understand a little bit more about what Google wants for its users.

The Road to Recovery: Long Term + Quality and Relevance

If there’s one thing that Google continues to clarify in every update, it’s that quick SEO wins without a solid foundation of long-term quality and relevancy goals won’t hold up from 2018 onward. Black hat tactics will always be around, but Google raises the bar of quality every year, making black hat SEO more and more difficult.

On my whiteboard I have two quotes for the week:

“Success rides on the wings of consistent discipline.”

"Vision without execution is hallucination.” - Thomas Edison

A lot of SEOs spend their time hallucinating about the results they wish they had as opposed to actually creating a strategic plan and executing that plan with consistent discipline.

The key to recovery is sticking to a methodology that delivers quality and relevance to Google over the long term.

Tip: How you should hire a marketing agency

I call this methodology TRAP. I learned the concept of TRAP from an SEO buddy of mine named Stephen Kang. He’s the founder of the popular Facebook group “SEO Signals Lab” and has done awesome things for the SEO community.

Here’s how the acronym of TRAP breaks down:

  • Technical - Can Google crawl my website?
  • Relevancy - Is it crystal clear what my website does to help searchers?
  • Authority - Is my website backlink worthy and authoritative?
  • Popularity - Is my website popular on social channels & PR?

Tip: Check out the On-page SEO Cheat Sheet for Better SEO

Technical - Start with the foundation

Technical is the first place we start when looking at a website. The goal is to remove any blockers inhibiting Google from crawling a website.

Check Devaluation

Devaluation is a signal that your website has some serious on-site technical issues. Here’s how you can determine whether your site has been devalued using a few simple Google searches, including the “site:” operator.

When doing a Google search for your brand name, you should see your website show up as the #1 result. If that’s not the case, you’ve most likely been hit with an algorithmic penalty.

Using the site operator search, you can go a step further by searching “site:yourdomain.com” in Google. The first result should also be your home page.

Doing these searches helps you determine if you have a domain or page devaluation. Searching Google for a paragraph of text from one of your pages can also help you find a site devaluation. Simply copy a paragraph of text from one of your pages.

Then, paste that content as a search in Google. You should see your website pop up as the first result. If not, you may be dealing with a penalty.

Check HTTP/HTTPS Conflicts

We also want to be sure there’s no conflict between HTTP and HTTPS. This means that Google should only be indexing the HTTPS version of your website. To check your site, use this search: site:domain.com -inurl:https.

Here are a couple of plugins you can use with WordPress to properly 301 redirect from HTTP to HTTPS.

  1. Really Simple SSL
  2. HTTP to HTTPS remover
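
If you’d rather verify the redirect yourself instead of trusting a plugin blindly, here’s a minimal Python sketch (assuming the requests library is installed; the example.com URLs are placeholders) that checks whether each HTTP URL returns a 301 pointing at its HTTPS version:

```python
# Quick check that the HTTP version of a page 301-redirects to HTTPS.
# The example.com URLs are placeholders; swap in your own pages.
import requests

urls = [
    "http://example.com/",
    "http://example.com/about/",
]

for url in urls:
    # Don't follow redirects so we can inspect the first response directly.
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location.startswith("https://")
    print(f"{url} -> {status} {location} {'OK' if ok else 'NEEDS FIX'}")
```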

Crawlable Pages - Ensure Google Can Only Crawl What Matters

We want to be sure that unimportant pages are not crawled by Google. Think of this as having a library that’s always organized.

Keeping your content organized will ultimately improve your relevance by super niching your indexed content to the stuff that actually matters.

Here are some examples of content you don’t want indexed:

  • Categories (especially empty categories)
  • Login pages
  • Internal search result pages
  • WordPress image pages
  • Product review forms
  • Checkout & cart pages
  • Product comparison pages
  • Session IDs
  • Header/footer pages (on some WP front-end editors)

You can find these pages by doing a site operator search such as “site:domain.com inurl:review”.
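
If you want to spot-check whether those low-value pages are actually blocked from indexing, here’s a rough Python sketch (assuming requests and BeautifulSoup are installed; the URLs are placeholders) that looks for a noindex directive in the meta robots tag or the X-Robots-Tag header:

```python
# Spot-check whether low-value URLs carry a "noindex" directive,
# either in the meta robots tag or the X-Robots-Tag response header.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/?s=internal+search",
    "https://example.com/cart/",
    "https://example.com/wp-login.php",
]

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_value = meta.get("content", "") if meta else ""
    header_value = response.headers.get("X-Robots-Tag", "")
    noindexed = "noindex" in (meta_value + header_value).lower()
    print(f"{url}: {'noindex' if noindexed else 'INDEXABLE - review'}")
```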

Crawl Errors

The next step is a programmatic crawl of the website with SEMrush, Screaming Frog, Agency Analytics, and Search Console.

SEMrush is a paid service that provides an accurate view of technical crawl issues. We also use the paid version of Screaming Frog and Agency Analytics to verify the reported errors from SEMrush.

After compiling all of the data, we can set priorities on which items to fix. Not every item will need fixing; we manually check each one and qualify it as a problem or not.

Other crawl errors that we check within Search Console include 404s, structured data, and the trend of Google’s daily crawl over the past 30 days. It’s important to check the structured data: errors caused by plugins or improperly placed markup can throw errors to Google and affect crawlability.

Page Speed

While running our programmatic check, we’ll head over to GTmetrix and check out our site speed.

We want to be sure our Google PageSpeed score is at least 80, with a page load time below 3 seconds. The ideal target is 90+ with a load time of 2 seconds or less.

Faster websites promote easier indexing and improved user behavior, which has an indirect effect on search engine rankings. If you believe your website has been devalued, fixing your website speed can have a drastic impact on your rankings.
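
If you’d rather pull scores programmatically than check pages one at a time, Google exposes PageSpeed Insights as a public API. Here’s a minimal Python sketch; the response field names reflect the v5 Lighthouse payload as I understand it, so treat them as assumptions and verify against the JSON you actually get back:

```python
# Pull a PageSpeed performance score via the PageSpeed Insights API (v5).
# An API key is optional for light usage; field names may change over time.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
url_to_test = "https://example.com/"  # placeholder URL

response = requests.get(
    API, params={"url": url_to_test, "strategy": "mobile"}, timeout=60
)
data = response.json()

# Lighthouse reports the performance score as 0-1; multiply by 100.
score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
print(f"{url_to_test}: PageSpeed score {score:.0f} (target: 80+, ideally 90+)")
```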

This score is an A+:

Next, we want to check the quality of our content, which is the relevancy step. I’ll be covering how you can quickly QC your site content to optimize your relevancy signals in part 2 of “How to Lift an Algorithmic Penalty Using TRAP.”

Relevancy - Use the Right Keywords

After doing comprehensive keyword research, you'll need to ensure you're avoiding thin and duplicate content.

Hummingbird and Panda can be your nemeses when you’re going up against an algorithmic devaluation. This isn’t easy to accomplish either. The eCommerce niche is one of the most challenging areas.

Thin Content

To determine if you have thin content, you'll want to use a tool like Screaming Frog or Siteliner.com to crawl your entire website. I prefer Siteliner, but it can be a tad expensive if you're crawling more than 250 pages. Anything under 250 pages is free!

You'll want to find all pages on your site that have less than 1000 words and do the following:

  1. Bulk up ALL content that's less than 1,000 words
  2. Noindex any unimportant pages that don't pertain to your subject matter
  3. Use a canonical tag for similar product pages and bulk up the main product page (for eCommerce)
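
Siteliner or Screaming Frog will surface these word counts for you, but if you want a quick sanity check of your own, here's a rough Python sketch (assuming requests and BeautifulSoup are installed; the URLs are placeholders) that counts only the body copy and flags anything under 1,000 words:

```python
# Rough word-count audit to flag potentially thin pages (< 1,000 words).
# It strips header/footer/nav so the count reflects body copy, which is why
# these numbers will run lower than a raw crawler's word count.
import requests
from bs4 import BeautifulSoup

pages = [
    "https://example.com/blog/post-1/",
    "https://example.com/blog/post-2/",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Remove boilerplate elements so only the main copy is counted.
    for tag in soup.find_all(["header", "footer", "nav", "aside", "script", "style"]):
        tag.decompose()
    words = len(soup.get_text(separator=" ").split())
    flag = "THIN - bulk up or noindex" if words < 1000 else "OK"
    print(f"{url}: {words} words ({flag})")
```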

Here's an example screenshot from Siteliner showing which pages we'd want to fix:

Now that we've identified thin content, we need to understand why that content is thin. It's common for these crawlers to report a lot more words than your actual body copy contains, because they are literally crawling the header, footer, sidebar, and paragraph sections of the page.

That's probably somewhere around an extra 200-300 words of content per page, which means you're probably going to end up closer to 700-800 words of paragraph content on every page (not 1,000).

Duplicate Internal Content

The next thing we want to look at with Siteliner.com is the amount of internal duplicate content. This is a massive infraction that's probably one of the most overlooked errors with on-page SEO. 

The most common ways duplicate content presents itself are through copied bullet points, paragraphs, about us sections, how-tos, and call-to-action paragraphs.

The key is to reduce the amount of thematic duplicate content throughout your site by ensuring each page is at least 90% unique compared to the other pages on your site. If your site has more than 10% duplicate content overall, you're going to need to do a deep dive. Here are some action steps you can take:

  1. Run a Siteliner scan to find duplicate content pages
  2. Reduce page by page duplicate content to less than 10%
  3. Note similar thematic sections that you keep on duplicating across your pages and ensure your writers don't add those sections in the future.

We can see from the image below that this example site has a pretty high level of duplicate content across other pages on their website. 

We run an internal duplicate content audit every quarter for our clients to make sure we are finding and addressing any internal duplicate content issues quickly.
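
If you want a quick-and-dirty approximation between Siteliner scans, here's a Python sketch that compares pages pairwise. It uses difflib's similarity ratio as a rough stand-in for Siteliner's duplicate content metric, and the URLs are placeholders:

```python
# Pairwise similarity check between pages on the same site, as a rough
# stand-in for an internal duplicate content report.
from difflib import SequenceMatcher
from itertools import combinations

import requests
from bs4 import BeautifulSoup

pages = [
    "https://example.com/services/seo/",
    "https://example.com/services/ppc/",
    "https://example.com/services/web-design/",
]

def page_text(url):
    # Pull the page and strip the boilerplate sections before comparing.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup.find_all(["header", "footer", "nav", "script", "style"]):
        tag.decompose()
    return soup.get_text(separator=" ")

texts = {url: page_text(url) for url in pages}

for a, b in combinations(pages, 2):
    similarity = SequenceMatcher(None, texts[a], texts[b]).ratio() * 100
    if similarity > 10:  # flag anything above the ~10% duplicate threshold
        print(f"{similarity:.0f}% similar: {a} <-> {b}")
```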

eCommerce Duplicate Content

For eCommerce this can be a big area of frustration, as you're trying to provide unique descriptions for products and there can be inevitable overlap.

I can't stress enough how imperative it is that you consider your methodology for providing unique, well-written content on your product pages.

One of the biggest offender platforms is Shopify. The tabbed product descriptions found in many themes can often encourage multiple duplicate content issues across all product pages of a website.

External Duplicate Content

For external duplicate content, we are going to use Copyscape.com. You'll want to use the batch analysis tool for this audit. Grab the summary URLs that were crawled by Siteliner (download a .csv) and place those URLs into the batch analysis tool.

This is a paid audit and can become quite expensive. If you're looking to save a few bucks, whittle down the URLs you want to examine to only the ones that are key for SEO.

Once you've run the batch, you're going to need to download your results as a .csv file and then sort them by the highest relative risk of duplicate content.
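
If the spreadsheet work gets tedious, here's a small Python sketch that does the risk sort for you. The column names ("URL", "Risk") are assumptions on my part, so match them to the header row of your actual Copyscape export:

```python
# Sort a Copyscape batch export so the highest-risk pages surface first.
# Column names below ("URL", "Risk") are assumptions - check the header row
# of your own export and adjust accordingly.
import csv

with open("copyscape_batch_results.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Highest relative risk first; empty cells are treated as zero risk.
rows.sort(key=lambda row: float(row.get("Risk", 0) or 0), reverse=True)

for row in rows[:20]:
    print(f"{row.get('Risk', '?'):>5}  {row.get('URL', '')}")
```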

You should get a view like this:

Now, take this data and either rewrite your content to ensure it's 90% unique, or request that the webmaster who "borrowed" your content take it down.

Sometimes these pages/articles are doing nothing for you (badly written old articles). It might be best to simply take down the article and do a 301 redirect.

Internal Anchor Text Distribution

The final thing we want to analyze for relevancy is your internal anchor text distribution. To keep from over-optimizing, we are just going to use the 25/25/25/25 rule: 25% exact match/money, 25% random, 25% brand, and 25% related.

Use a tool like Screaming Frog to crawl every single link across your site. You're going to want to map out how these links are currently supporting your primary pillar/cluster and then make adjustments based upon your anchor text distribution.

Track everything in a Google Sheet and keep it updated as you add more content to the site so you stay within compliance.
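
To take some of the manual tallying out of that sheet, here's a rough Python sketch that reads an inlinks export from your crawler and buckets the anchors into the four categories. The "Anchor Text" column name and the keyword lists are assumptions you'll need to adapt to your own export and your own site:

```python
# Tally internal anchor text into the four 25% buckets
# (exact match / brand / related / random) from a crawler's inlinks CSV.
import csv
from collections import Counter

EXACT_MATCH = {"seo services", "google penalty recovery"}    # money keywords
BRAND = {"fannit", "fannit marketing services"}              # brand terms
RELATED = {"search engine optimization", "organic traffic"}  # topical variants

def classify(anchor):
    anchor = anchor.strip().lower()
    if anchor in EXACT_MATCH:
        return "exact match"
    if anchor in BRAND:
        return "brand"
    if anchor in RELATED:
        return "related"
    return "random"  # everything else: "click here", naked URLs, etc.

with open("all_inlinks.csv", newline="", encoding="utf-8") as f:
    anchors = [row.get("Anchor Text", "") for row in csv.DictReader(f)]

counts = Counter(classify(a) for a in anchors if a)
total = sum(counts.values()) or 1
for bucket, count in counts.most_common():
    print(f"{bucket:12} {count:4}  ({count / total:.0%}, target ~25%)")
```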

Now that our relevancy signals are squeaky clean we are going to audit the authority of our backlinks. This will come in part three!

Keith Eneix

I'm the founding partner & CEO of Fannit Marketing Services. I started Fannit to help bridge the gap for business owners to move from owning a job to investing into their own business in a way that can create lasting wealth for them and their employees.