Before we became a marketing agency for small- to medium-size businesses, Qualbe Marketing Group served our own online brands. We built our expertise marketing these products and brand names.
It wasn’t always easy. Nearly five years ago, one of our brands was penalized by Google. It was from this experience that we learned the power and responsibility of SEO. This is our story.
What Happened?
We thought we were doing everything right. We had a great team; we were making sales, growing our business and increasing our staff… but then our traffic tanked. What happened?
Essentially, we had stopped paying attention to the latest Google Search Engine updates. Acceptable SEO practices for Google had changed in ways that we couldn’t afford to miss:
- Link tactics that were once tolerated were no longer acceptable. (One example: using comment linking, i.e., commenting on other blogs with a link back to your website, as your primary form of link building.)
- Spammy websites had begun linking to us in droves. Google saw this trend and decided we must be a spammy website, too!
- We had inconsistent and duplicate content in two key areas: our Spanish website and affiliate websites.
- Our pages were inundated with the same keywords. We were guilty of keyword stuffing and didn’t know any better. Google had made it clear that this practice was no longer acceptable.
It sounds really bad when you read it now, but think back 5 or 10 years and you’ll remember that many businesses were using these kinds of tactics. That’s just what businesses did to get seen online.
In 2013, one of our brands was hit by two Google penalties: a manual penalty (hit in early March 2013) and an algorithmic penalty, known as Penguin 2.1 (hit in early October 2013).

What’s the Difference?
Manual Penalty
A manual penalty comes from a Google representative who has personally reviewed your website and determined that it breaches one or more of Google’s guidelines. It is usually accompanied by a message from Google.
Algorithmic Penalty
An algorithmic penalty is usually categorized as either Panda or Penguin. Panda penalties deal with usability of your website and the quality of your site’s content. Penguin penalties deal with over-optimization of your website.
While some people think algorithmic penalties don’t exist anymore, that isn’t actually true. Algorithmic penalties do still exist. Moz puts together an Algorithm Change History that can help you monitor fluctuations and avoid potential hits.
However, there is an important difference between the effects of an algorithm penalty and an algorithm change. A slight dip in your site’s rankings in the search results does not mean you have been penalized. You are probably just experiencing natural fluctuations in Google’s algorithm.
So how do you tell the difference? Find out if your competitors are seeing changes, too. You can do this by searching their website domain in SEMrush. Once your report completes, look at their “Top Organic Keywords” by choosing “View Full Report.” You can monitor any major ranking fluctuations using this tool.
An algorithmic penalty is more severe than an algorithmic change because it means your website’s rankings have drastically tanked or been removed altogether. In addition, an algorithmic penalty is harder to detect and could, at one time, take longer to refresh once you have fixed the problems.
In August of 2013, we hired an SEO agency to help us figure out the site issues that had caused our manual and algorithmic penalties and come up with next steps for fixing them.
They found the following problems with our website at the time:
Penguin Penalty: Unnatural Links to Our Site
Google launched the Penguin Update in April 2012 to catch sites that were spamming the search results, especially by buying links or obtaining them through link networks designed primarily to boost Google rankings.
Google launched Penguin 2.1, the fifth release of its spam-fighting algorithm update, in October 2013. That was when we were hit.

1. Problem: We had a very spammy backlink profile. Practices that Google had once tolerated were now being penalized, such as commenting on other blog posts with a link back to our own website. We had also acquired a lot of bad backlinks without realizing it. Example: auto dealerships, completely unrelated to our industry, had linked to our site. Google had begun penalizing links that didn’t look natural.
Solution: After a deep analysis of our backlink profile, we reached out to the spammy websites and asked them to either remove their links to us or nofollow them. If a site did not comply after several contact attempts, we disavowed its links by submitting a disavow file to Google, along with a full record of the actions we had taken to remove the links ourselves. Google wanted to see that we had put in the effort to clean up these bad links on our own.
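For reference, a disavow file is just a plain text file in the format Google documents for its Disavow Links tool: lines beginning with # are comments, and each remaining line lists either a single URL or an entire domain. The domains below are invented for illustration:

```
# Site owner contacted on 6/1/2013 and 7/15/2013; no response
domain:spammy-link-directory.example

# Removal request ignored; disavowing this single URL
http://unrelated-auto-blog.example/post-with-our-link.html
```

A domain: line covers every link from that site, which is usually the safer choice for clearly spammy domains.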
Manual Penalty
2. Problem: A significant number of URLs on our site were being crawled but not indexed by the search engines, which meant they were not getting organic traffic from Google.
Solution: Google has a series of tools that are fairly unknown outside of digital marketing. We adjusted some of these settings so Google would crawl and record our most relevant content while ignoring the least helpful pages. We did this by optimizing crawl rates and using URL parameter handling.
3. Problem: Our robots.txt file (the file that tells search engines which pages you don’t want crawled) did not list all of the right pages and made no reference to our sitemap (the file that listed the pages we did want indexed).
Solution: We corrected our robots.txt file to include the pages we did not want indexed (/tag URLs, archives, comments, etc.) and included the location of our sitemap for Google’s bots that were crawling our site.
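A minimal robots.txt along those lines might look like the sketch below (the paths are illustrative, not our actual configuration):

```
User-agent: *
Disallow: /tag/
Disallow: /archives/
Disallow: /comments/

Sitemap: https://www.example.com/sitemap.xml
```

One caveat worth knowing: robots.txt controls crawling, not indexing, so a blocked URL can still appear in search results if other sites link to it; that is why noindex tagging is also part of a cleanup like this.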
4. Problem: Multiple sitemaps were found for our website.
Solution: We got rid of the unnecessary sitemap and combined the other with our original sitemap, so Google would have a clear directory of all of the pages we wanted indexed.
5. Problem: Our sitemap was not structured per Google’s recommendations.
Solution: We made a cohesive sitemap that included a sitemap-index, under which could be found our main sitemap and blog sitemap.
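Per the sitemaps.org protocol that Google follows, a sitemap index is itself a small XML file pointing at the child sitemaps; the filenames here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-main.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit only the index file to Google, and it discovers the child sitemaps from there.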
6. Problem: When Googlebot visits a website, it has a predefined limit on the amount of time it will spend crawling the site, as well as the volume and depth of content that it will crawl. Optimizing for this “crawl budget” means ensuring that Googlebot has fast, easy and complete access only to the content that you want to appear in the search engines. Our site was making it difficult on the Googlebots due to 404 errors, dead pages, slow site speed and content that should have been excluded from crawling.
Solution: Several fixes were needed:
- Fix 404 errors and update internal linking structure to prevent future 404 errors from happening.
- Improve the site’s page speed, which was 69/100 on desktop and 67/100 on mobile:
- Reduce server response time to under 200 ms
- Enable compression
- Use Content Delivery Network to lighten load of sending images from our web server
- Leverage browser caching
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Minimize redirects
- Minimize request size
- De-index pages that should not be indexed using “noindex” tagging and disallow Google from crawling these pages. These pages included:
- HTTPS pages that, at the time, were being crawled in addition to their HTTP counterparts (about 400 URLs)
- Internal search result pages like our different dentist pages (100 URLs)
- blog/tag URLs (1000 URLs)
- blog/page URLs (40 URLs)
- blog/category pages (100 URLs)
- Blog URLs with parameters: replytocom, srp, sra (700 URLs)
- Example:
- blog/comment-subscriptions
- /?srp=*
- /*&srp=*
- /?sra=*
- /*&sra=*
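The “noindex” tagging itself is a robots meta tag in each page’s head. A minimal sketch for, say, a blog tag-archive template:

```html
<!-- In the <head> of every page that should stay out of the index -->
<meta name="robots" content="noindex, follow">
```

One ordering detail: Google only honors noindex on pages it is allowed to crawl, so the robots.txt disallow should be added after the pages have dropped out of the index, not before.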
7. Problem: Our website contained internal duplicate content. Some pages on our Spanish site were not fully translated, resulting in duplicate English-language content. Also, blog posts were published in their entirety on the blog homepage and on their individual pages.
Solution: For the first problem, we finished translating the English pages to Spanish. For the second problem, we implemented a “Read More” tag on all of our blog posts so only the snippet introduction could be seen on the blog homepage, enticing readers to click and view the full post.
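For blogs running WordPress (other platforms have an equivalent), the “Read More” behavior comes from the built-in more tag: everything above it becomes the excerpt on archive pages, and the full text renders only at the post’s own URL.

```html
This introduction is all that appears on the blog homepage.
<!--more-->
Everything below the tag is rendered only on the individual
post page, so the full text lives at exactly one URL.
```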
8. Problem: The Googlebots were finding external duplicate content. Some of our content was used verbatim on other websites, even some websites we owned or had a partnership with.
Solution: We contacted sites and asked them to change the wording on their website; we talked to our affiliates about using a nofollow tag on their affiliate links to us; and we updated the content on our own site so it would be unique.
9. Problem: Versions of www and non-www URLs were found on our website.
Solution: We permanently redirected the non-www to the www.
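On an Apache server, for instance, that permanent redirect is a short mod_rewrite rule in .htaccess; example.com stands in for the real domain:

```apache
RewriteEngine On
# 301-redirect any non-www request to the canonical www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which tells Google to consolidate ranking signals onto the www version.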
As you can see, we had a lot of problems and two penalties to overcome. In the meantime, our website traffic had tanked, calls were slow and we were trying to brainstorm other methods of getting sales to our business.
The solutions needed were not all quick fixes. Some took a few tries to get right. We spent months going through each error, poring over our bad links and contacting sites to fix things that were causing issues for our website.
Our work began to yield results. The total number of broken links started to decline and other issues were being resolved. My job was to identify the issue, find a solution and either implement the change myself or send it to our IT department to fix. It was a team effort.

After going through all of our errors and believing we had resolved all of our link issues, we sent our first reconsideration request to Google. We received this message back:

We hired a new consultant to help us overcome our penalties and assist us in finding what we had missed. She helped us go through our backlinks again, send a disavow file to Google and submit another reconsideration request.
After what had felt like an eternity of no visible progress in the SERPs and no positive response from Google, our reconsideration request was approved on March 23, 2014.

The manual penalty had been revoked, which was a relief, but the Penguin penalty still existed. Even though we had done everything we needed to do to resolve our site’s issues, the Penguin penalty couldn’t be revoked until the next algorithm refresh.
We finally saw the algorithm refresh on October 17, 2014 (more than a year later!). We had donuts that looked like penguins to celebrate that morning, and I needed a vacation.


How Did We Survive the Waiting?
While this might seem like a silly question to some, businesses that have felt the weight of these penalties will understand what I mean. Many businesses went under. They could not survive the dramatic decline in their organic search traffic and either did not know how or did not have the means to recover.
Getting caught in a Google penalty is serious. With a drop in traffic, our company saw a significant drop in sales. We had disappeared from the SERPs, and anything we tried to do to improve our web traffic and rankings was capped by Google.

We didn’t give up, though. As we worked on improving our site and getting back into Google’s good graces by following their Webmaster Guidelines, we began implementing new digital strategies to gain more traffic and visitors. When the penalty was finally lifted, we’d be ready.
- We improved our social media strategy by…
- updating our existing social media pages, resolving duplicate pages and creating pages on newer social media channels
- posting on social media more frequently
- We established our buyer personas. This helped ensure we were reaching the right audience and not wasting time and money targeting the wrong people.
- We changed our blog strategy by…
- Producing more educational content related to our industry for our readers
- Targeting a different persona each month. One month we wrote to seniors; another month we wrote to parents; the next month we wrote to working professionals. (We had some overlap, but we saw positive results)
- Promoting our content more aggressively on social media
- Collaborating with other experts in the industry
Another way we survived those hard months was through the use of PPC. Our PPC team did an amazing job at bringing in traffic and sales, and they still do to this day.
Did We See an Immediate Improvement When the Penalty Was Lifted?
This was the discouraging part. While we did see a small boost after our manual penalty was lifted, for a while we still felt like we were hitting a ceiling we couldn’t get through. Our rankings would only improve so much. We couldn’t get back to where we were pre-penalty, and it was disappointing.
Fortunately, once the Penguin algorithm update happened in October of 2014, we saw a significant increase in our organic traffic. Our penalty had been lifted, and the work we had put into resolving our site’s issues and building better links, content and pages for our users paid off.

We had anticipated that when the algorithmic penalty was finally lifted, we would need strategies in place to regain the links we had lost (this time replacing them with good, natural links that would boost our site instead of hurting it), pages that were beneficial to our users and a site that abided by Google’s Guidelines.
We found a couple of useful link strategies that helped us build up a better backlink profile:
- We Offered a Scholarship
- Our most successful link-building campaign was our scholarship campaign. It brought in natural, high-quality backlinks that gave our site a boost and provided us an opportunity to contribute to students’ education.
- We Developed Helpful Resources
- We produced several guides and new infographics surrounding our industry and began gaining some positive links in return.
Now, years later, our organic traffic is almost back to where it was pre-penalty.
Penalties cause your business to take a hit and may take a while to recover from, but if you can survive them and learn from your mistakes, you’ll come out stronger on the other end. It takes patience, perseverance and a great team working together to move forward.

Do Google Penalties Still Exist Today?
Yes… although the waiting may not be as painful. Some businesses still see drastic declines in their search traffic as a result of spammy link practices, but most can bounce back quickly. The release of Penguin 4.0 was a serious update to Google’s algorithm: it now refreshes in real time. If you find issues on your site today and resolve them tomorrow, you might see improvements to your rankings the following day, a drastic improvement over what we experienced all those years ago.
Manual penalties certainly still exist, too, so they’re something to be mindful of as you operate your business’s website.
If you learn anything from this post, we hope you understand the importance of SEO and why it needs constant attention. We learned that the hard way.
SEO is complicated and changes a lot. We’ve been able to grow our SEO expertise through years of hard work.
If you aren’t sure if your site is compliant with Google’s guidelines, consider having an SEO agency you trust conduct an SEO audit of your website. A thorough SEO audit can identify any issues that could be potentially hazardous to your site’s standing with Google.
We can help you with that! We’ve been there. We know the warning signs and know the importance of handling these issues with the care and attention they deserve.