There are two types of SEOs: the white hats and the black hats. Today, we’re going to talk about black-hat SEO, which, if you haven’t already guessed, is the kind that will get your site penalized.
Although optimizing your website is necessary if you want your customers to find you on search engines, there is such a thing as over-optimization. Bending (or outright breaking) the rules in order to gain higher rankings will produce the exact opposite result: you’ll most likely get hit by a Google penalty and experience a severe drop in rankings.
What are some old SEO tactics you should avoid and how can you fix them?
Practice 1: Duplicate content
Duplicate content is when the same copy appears on two or more pages of your site, or when multiple pages share the same meta description. This presents a problem because search engines won’t know which version is most relevant to a user’s search query and will be forced to display one that might not be the original or the best version.
Let me be clear: not all duplicate content is bad. There are plenty of innocent reasons your site can have duplicate content, like having www vs. non-www versions of your site or pagination issues on your blog. These are technical issues that your webmaster or SEO team can fix relatively easily.
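For those benign, technical cases, a canonical tag is the usual fix: it tells search engines which version of a page to treat as the original. Here’s a minimal sketch, assuming your preferred version is the www URL (example.com and the path are placeholders):

```html
<!-- Placed in the <head> of every duplicate or variant page. -->
<!-- Points search engines at the preferred (canonical) URL. -->
<link rel="canonical" href="https://www.example.com/blog/post-title/" />
```

A sitewide 301 redirect from the non-www host to the www host (or vice versa) accomplishes the same thing at the server level.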
The problem is when duplicate content is malicious in nature, done for the express purpose of manipulating search engines to gain higher rankings. These can include:
- Scraped content: Content taken from another site and republished with little or no modification
- Multiple websites: When you create two (or more) websites under different URLs but for the same business
- Recycled product or location copy: Using the same description for different products and/or locations (prevalent in ecommerce and local business sites)
Here’s a good visual by Moz that explains why duplicate content is a problem.
In the past, these tactics used to work (sort of), not because Google allowed them (they banned offenders…multiple times) but because sites didn’t get caught as frequently. Google’s algorithm now cracks down on these tactics, because serving a duplicated page that doesn’t match the search query creates a bad user experience.
How to fix manipulative duplicate content
Scraping content from another site in order to avoid writing copy or creating multiple websites in an attempt to rank higher for particular keywords is the lazy route, and it will ultimately get you a Google penalty that will result in a drop in rankings.
Instead, take the time to write original copy for each page, product, and/or location. Though time-consuming, this is the better strategy in the long run, because you not only enrich your customers’ knowledge but also give search engines a better idea of what each page is about.
Practice 2: Keyword stuffing
“Keyword stuffing” is the practice of loading your site copy or meta descriptions with your keywords in an attempt to rank higher for that keyword. Here’s an example:
Say your keyword is “NJ flower shop,” and throughout your site, you have sentences like: We are an NJ flower shop that sells NJ flowers. Visit our NJ flower shop today to speak to one of our NJ flower shop attendants!
Not only is that copy uncompelling, it’s also hard to read, which creates a bad UX. Panda 4.1, Google’s most recent algorithm update from September, penalizes sites that have thin content overloaded with keywords.
Need more convincing? Here’s a video of Matt Cutts, head of Google’s Webspam team, from 2011 talking about why keyword-dense content is bad:
How to fix keyword-stuffed site copy
I’ve said this before and I’ll say it again: Write for humans, not for search engines! Google focuses less on keywords and more on long-tail search queries in order to display results that better match a user’s search intent.
Therefore, focus on creating quality content that clearly answers your customers’ questions and enriches their experience on your website. I’m not saying that you shouldn’t add keywords in your site copy. However, use them sparingly, naturally, and in-context.
Practice 3: Spammy links
The most common link-building scheme is paid links, which is exchanging money for a link or a post containing a link. Google also says that they may penalize sites that create native advertisements, as that’s just a paid link disguised as editorial content. This is why you’ll see that reputable sites that publish native ads either don’t contain links back to the sponsor company or they have a rel=nofollow on the links to avoid a Google penalty.
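As an illustration, here is what a nofollowed sponsored link looks like in a page’s HTML (the URL and anchor text are placeholders):

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit through this link. -->
<a href="https://sponsor-company.example.com/" rel="nofollow">Sponsor Company</a>
```

This lets a publisher disclose and link to a sponsor without the link counting as an endorsement in Google’s eyes.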
It’s a myth that the more backlinks you have, the higher your rankings will be. What matters is not the quantity of your backlinks but their quality.
How to fix spammy links
There’s no shame in having spammy links in your site history. Honestly, most domains that have been around for a long time probably have several, simply because that’s how things used to be done. The only shame now is in not doing anything about them.
Analyze your backlink profile by going through every backlink and determining whether it is legitimate or spammy. Contact the webmasters of the spammy sites and ask them to remove the links back to your site. If they don’t, disavow those links; left alone, they will weigh down your site authority and decrease your rankings.
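Google’s Disavow Links tool accepts a plain text file with one URL or domain per line. A short sketch, with hypothetical spammy domains standing in for real ones:

```text
# disavow.txt: uploaded via Google's Disavow Links tool
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
# Or disavow a single offending page:
https://link-farm.example.net/widgets/page1.html
```

Only resort to the disavow file after outreach fails; it tells Google to ignore those links when assessing your site.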
The Bottom Line
Implementing tactics for the sole purpose of gaming your search rankings will backfire. To rank well on search engines, you have to take into account your entire Web presence: a combination of links, content, social, and onsite factors.
Therefore, implementing an integrated digital approach that combines design, development, and digital marketing in order to create a better user experience for your customers will not only increase your rankings but will also inspire more loyalty from your customers.