Safeguarding your website from Google's updates is an ever-changing game. Google makes major changes to its algorithms every year; some it announces, and some it doesn't. Marie points out some basic safeguards if you want to keep your site from breaking the rules.
I was not really aware of the last item on sitemaps and how they can be edited. We will be reviewing these steps in detail in the future.
4 Ways Affiliate Marketers Can Safeguard Their Sites From Google Algorithm Updates
The power of Google is undeniable: With an estimated 30+ billion queries per month across all platforms, and as much as 95% of global mobile search (63%+ on desktop), it’s the go-to search engine for most people worldwide.
Notice the most important word in that last sentence: people.
As Google itself will tell you, its job is to give users "useful and relevant results in a fraction of a second." This is why there are frequent tweaks to its search algorithm, which is the process by which Google finds, ranks and delivers results.
In early 2017, the most recent wave of Google algorithm updates, “Fred,” was launched, and similar to the Panda updates in 2011, it deeply impacted search results for what Google calls “thin content” sites. For sites that prioritized search results over user experience, it was particularly devastating. For many affiliate and retail sites, the impact was significant. Any site that, in Google’s estimation, had too many ads and/or that offered low-value content saw a drop in search traffic of reportedly 50 to 90 percent.
Fred not only penalized sites that emphasized SEO over user experience, but also some quality sites with a perfectly acceptable user experience that had structural and content-related pitfalls.
So whether you’re still trying to recover from a hit to your webpage rankings by Fred or want to algorithm-proof your site for future updates, there are several actions you can take to safeguard your site’s visibility to your target users through search engines.
1) Make sure ads add value to the overall user experience and aren’t confusing, obtrusive or disruptive.
Google has clear guidelines on using its own advertising platforms. These aren’t just helpful hints; now it’s clear that they’re directly related to rankings, too.
For example: “Publishers should avoid site layouts in which the ads push content below the fold. These layouts make it hard for users to distinguish between the content and ads.” [source]
Keep in mind that the ads and affiliate links you include on your site are there to enrich the user experience, not overwhelm it. Done well, affiliate links can be useful and add value, but they should not be the primary focus of a website.
2) Use links sparingly and only when they truly add value.
Both Fred and Panda aimed to cut off "link juice" as a means for publishers to fuel revenue generation. A laundry list of affiliate links is not only confusing, it's also not helpful. Remember, Google algorithm updates are all about assisting the user, so make sure your links are not only valuable, but that they also appear in the context of beneficial content.
For example, at eBay we’re re-building our search results pages to include helpful information for customers who are searching for a specific product. We’re moving toward an easy-to-browse experience that includes a snapshot of related products and detailed information so users can easily compare and contrast products. We believe this will better serve shoppers, especially those who come to eBay from a search engine or affiliate website.
3) Manage your search footprint – less is more.
Your crawl budget is one of your most precious commodities. Every link has to be crawled by Google. Duplicate pages or those that don’t add value to customers make the search engine’s job difficult — which is something you absolutely should avoid doing.
The perfect example of duplicate affiliate content that Google’s Fred update penalized is “daily deal” sites that create multiple pages for a given product, changing only the price and merchant. When thinking about all the deals/products/categories included in a site like this, you can see how quickly proper indexing becomes an impossible task. This dilutes the site’s value to the user.
- To correct this issue, consolidate duplicate pages into a single useful page that provides the core content once (specs, description, images, etc.) and highlights the differentiating factors. (If you've been in the habit of creating multiple pages of duplicate content, this may be simplest to do going forward.)
- To check out how Google has indexed your older content, try this expert hack:
- Open a new browser window and type site:yourwebsitename.com into the search bar. (Note: there's no space between "site:" and your domain name.)
- Scan the results to see if you have duplicate listings for the same pages.
- If you do, log into the backend of your site and delete the duplicates, leaving just the original page.
- You can also look at Google Search Console for HTML improvement suggestions (under “Search Appearance”). Check to see if there are any complaints from Google about duplication or missing titles — and if there are, fix those problems immediately.
- Finally, be strategic about creating pages around keywords. If you use the same keyword over and over on multiple pages (which is an old-school practice), Google won’t see your site as an authority on a particular subject related to that keyword; it will simply see it as run by someone who didn’t get the memo that keyword stuffing is a sure sign of a low-value, low-quality website.
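Beyond the manual `site:` search described above, the same duplicate check can be scripted. Below is a minimal sketch (not part of the original article) that groups a list of URLs by their path with query strings stripped, since "daily deal" pages that differ only by parameters like merchant or price are exactly the kind of near-duplicates Fred penalized. The URLs here are hypothetical examples.

```python
from urllib.parse import urlparse
from collections import defaultdict

def find_duplicate_paths(urls):
    """Group URLs by (host, path), ignoring query strings.

    Pages that differ only by query parameters (e.g. ?merchant=acme)
    often index as near-duplicates and waste crawl budget.
    """
    groups = defaultdict(list)
    for url in urls:
        parsed = urlparse(url)
        key = (parsed.netloc, parsed.path.rstrip("/"))
        groups[key].append(url)
    # Keep only the groups with more than one URL -- the duplicates.
    return {k: v for k, v in groups.items() if len(v) > 1}

# Hypothetical deal-site URLs: two "widget" pages differing only by merchant.
urls = [
    "https://example.com/deals/widget?merchant=acme",
    "https://example.com/deals/widget?merchant=globex",
    "https://example.com/deals/gadget",
]
for (host, path), variants in find_duplicate_paths(urls).items():
    print(f"{path}: {len(variants)} near-duplicate URLs")
```

In practice you would feed this the URL list from your sitemap or crawl export, then consolidate each flagged group into one canonical page as described above.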
4) Manage your pages with a user-friendly site map.
It takes work to structure a site so it’s easy to browse. But taking the easy way out by providing a very top-level category structure and placing all items under it makes it much harder for both Google and users to find their way around your site — and that becomes a big problem for you.
You can’t expect Google to find your pages if you don’t link to them or if you provide nothing more than a set of your own keyword-based search results pages. To make your site easy to browse, consider giving Google a map it can crawl efficiently. The best way to do this is to reflect taxonomies like tags and categories in your XML sitemap. A little strategic site mapping goes a long way in supporting strong organic search results.
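To illustrate the idea, here is a minimal XML sitemap sketch that lists taxonomy pages (a category and a tag) alongside a product page nested under its category. The URLs, dates, and frequencies are hypothetical placeholders, not values from the article; the element names follow the standard sitemaps.org protocol.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Category (taxonomy) page: signals site structure to the crawler -->
  <url>
    <loc>https://example.com/category/electronics/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <!-- Tag page grouping related products across categories -->
  <url>
    <loc>https://example.com/tag/wireless-headphones/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <!-- Individual product page nested under its category -->
  <url>
    <loc>https://example.com/category/electronics/acme-headphones/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```

Listing category and tag pages this way gives the crawler a browsable hierarchy instead of a flat dump of product URLs.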
All of these actions add up to one basic principle: make sure your website is made for people, not just search engines.
Because when you keep the user experience front and center, you’re actually ensuring that your site is search-engine friendly, too. Google’s organic search favors sites that provide real value and have real utility when it comes to helping real people find (and buy!) what they’re looking for. And that’s the best way to safeguard your site and its search results from Google’s next algorithm update.
By Marie Langhout-Franklin
Head of Marketing, eBay Partner Network