
How much is too much? Faceted navigation and SEO

On e-commerce websites, faceted navigation plays a critical role in allowing customers to find the products they want quickly.

Major sites generally offer a range of filters (show a subset of products), sort orders (show products in different orders) and pagination (break long lists of products into multiple pages).

From a search engine optimization (SEO) standpoint, creating more pages that tease out the different facets of products users may search on is generally a good thing. Offering more pages lets you compete more effectively for the long tail of search for your brand. These uniform resource locators (URLs) also make it easy for users to send links to friends or relatives to view specific product selections.

Too much of a good thing

On the other hand, it’s also possible to have too much of a good thing. You can reach a point where you’re creating too many pages, and search engines will start to see those incremental pages as thin content.

Go too far, and you may even receive a thin content penalty like this one:

But even without receiving a penalty, adding too many pages can result in a drop in traffic, like what you see here:

So how do you know what’s too much? That’s what I’ll address in today’s post.

Examples from prominent retail sites

Ever wonder how you get to the point of having too many pages? After all, pages are things that users may find valuable. Surely it would make sense to have all versions of your product pages crawled and indexed.

To illustrate a bit further, let’s take a look at the potential number of pages on the Zappos site that relate to men’s Nike running shoes:

The numbers shown are the number of possible choices in each category. There are 13 possible sizes, eight widths, 16 different colors and so on. Multiplying this all out, that means there are over 900,000 possible pages in this category. That’s how many pages would get created if all combinations of choices were allowed together.
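The math escalates so quickly because every facet group multiplies the total. In the sketch below, only the size, width and color counts come from the example above; the remaining facet groups are hypothetical stand-ins, but a couple of them are all it takes to push past 900,000 combinations:

    // Size, width and color counts are from the Zappos example above; brand
    // and price are hypothetical stand-ins for the remaining filter groups.
    const facetCounts = { size: 13, width: 8, color: 16, brand: 32, price: 18 };

    const combinations = Object.values(facetCounts).reduce((a, b) => a * b, 1);
    console.log(combinations.toLocaleString()); // 958,464 -- on the order of the 900,000+ figure above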

Even if Zappos filters out all the combinations for which there are no products, there are probably many combinations with only one or two products. All of these pages will look remarkably like the individual product pages for those shoes.

Let’s now take a look at an example of lipstick for sale on Amazon. Here’s what we get there:

That’s a lot of different types of lipstick! As with the Zappos example, it’s likely that many combinations of filters will result in pages showing only one or two products, and this can be quite problematic from a thin content standpoint.

Let’s talk guidelines

Many of you may be thinking, “Sites like Amazon index all their pages, why can’t I?” Well, the simple answer is: because you’re not Amazon.

At some level, your reputation and the search demand for your site play a role in the equation. Sites that see extremely high levels of search demand appear to get more degrees of freedom in how many pages they create via faceted navigation.

On the other hand, this doesn’t always work to Amazon’s advantage. For example, if you search on “mens DKNY jeans,” you get the following result:

Every site that ranks has a category/filtered navigation page except for Amazon, which ranks with a product page. This approach of indexing everything may be hurting Amazon as well; they’re simply able to rank with non-optimal pages, and likely not as well as they would if they made an attempt to limit crawling to a more reasonable set of pages.

For the record, Google disclaims the existence of any domain-level authority metric that could explain why sites like Amazon have more degrees of freedom around thin content than other, lesser-known sites.

Google also says they treat Amazon (and other highly visible sites) the same as any other site.

I’ll take their word on this, but that doesn’t mean there aren’t other metrics out there which could be applied equally to ALL sites and cause some of them to be more sensitive to thin content than others.

For example, any analysis of user engagement levels would give an advantage to famous brands, because users give brands the benefit of the doubt.

For lesser-known sites, there is clearly more sensitivity in the Google algorithm to the creation of additional pages. The traffic chart I shared above is an example of what happened to one site’s traffic when they did a large-scale buildout of their faceted navigation: They lost a full 50 percent of their traffic.

There was no penalty involved in the process, just Google having to deal with more pages on this site than was good for it.

Here’s what happened when the problem was fixed:

Guidelines and help

So, what guidelines should you apply to avoid indexing too many faceted navigation pages?

Unfortunately, there is no one-size-fits-all answer. To be clear, if there is user value in creating a page, then you should definitely create it, but whether you allow it to be indexed is a separate question.

A good starting place is to set up some rules for indexation around the two following concepts (a code sketch follows the list):

  1. Don’t index faceted navigation pages with no more than “x” products on them, where “x” is some number larger than 1, and probably larger than 2.
  2. Don’t index faceted navigation pages with less than “y” search volume, where “y” is a number you arrive at after testing.
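
Here’s a minimal sketch of those two rules as a single check, written in TypeScript. The function name, the FacetPage shape and the field names are all hypothetical stand-ins for whatever your platform actually tracks:

    // A sketch of the two indexation rules above. All names here are
    // hypothetical -- adapt the shape to your own platform's data.
    interface FacetPage {
      productCount: number;        // products listed on this faceted page
      monthlySearchVolume: number; // estimated searches per month for the matching query
    }

    // Allow indexing only when the page clears both thresholds.
    function shouldIndex(page: FacetPage, x: number, y: number): boolean {
      return page.productCount > x && page.monthlySearchVolume > y;
    }

    // Example with the starting values suggested below (x = 5, y = 100):
    const candidate: FacetPage = { productCount: 3, monthlySearchVolume: 250 };
    console.log(shouldIndex(candidate, 5, 100)); // false -- too few products, so keep it out of the index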

How do you determine “x” and “y”?

I find the best way to do this is through testing. Don’t take your entire site and all at once build out a huge faceted navigation scheme and allow every single page to be indexed. If you want the large scheme for the benefit of users, by all means build it, but block indexation for the more questionable portion of the architecture at first, and gradually test increasing the indexable page count over time.

For example, you might initially start with an “x” value of 5 and a “y” value of 100 searches per month. See how that does for you. Once that’s stable, if everything looks good, you can try lowering the values of “x” and “y,” perhaps on a category-by-category basis, gradually over time.
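
If you manage the thresholds per category, the check above can be driven by a simple lookup table; the category names and numbers here are purely illustrative:

    // Hypothetical per-category thresholds, loosened gradually as testing allows.
    const thresholds: Record<string, { x: number; y: number }> = {
      "default": { x: 5, y: 100 },       // the conservative starting point
      "running-shoes": { x: 3, y: 50 },  // a category that has already proven itself
    };

    function thresholdsFor(category: string): { x: number; y: number } {
      return thresholds[category] ?? thresholds["default"];
    }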

That way, if you go past the natural limit for your site and brand, it won’t show up as a problem like the example I showed above.

Summary

As I’ve noted, set up your faceted navigation for users. They come first. But implement controls over what you allow to be indexed so you can get the best possible SEO value at the same time.

The most common tool for keeping a specific facet out of the index is a rel=canonical tag pointing to the page’s parent category. This can work well for a site.
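
For example, a filtered page might declare the unfiltered parent category as its canonical. The URLs below are invented for illustration; keep in mind that rel=canonical is a hint to search engines, not a directive:

    <!-- In the <head> of a filtered page such as
         /mens-nike-running-shoes?width=4E&color=red (hypothetical URL): -->
    <link rel="canonical" href="https://www.example.com/mens-nike-running-shoes" />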

A second solution would be the noindex tag.
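
This is a meta tag placed in the <head> of each faceted page you want kept out of the index; pairing it with “follow” lets crawlers continue to pass through the page’s links:

    <!-- In the <head> of a faceted page that should stay out of the index: -->
    <meta name="robots" content="noindex, follow" />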

That said, my preferred method is using asynchronous JavaScript and XML (AJAX) to reduce the creation of pages you don’t want in search engine indexes. If you know that you don’t want to index all the pages from a category of facets, then AJAX is a way to let users still see that content without it actually appearing on a new URL.
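
Here’s a minimal sketch of the idea in TypeScript, assuming facet link markup and a JSON endpoint that are both invented for illustration. The click is intercepted, the filtered results are fetched in the background, and the product grid is redrawn in place, so no new URL is ever created for search engines to find:

    // Browser-side sketch. Assumes facet links like
    //   <a class="facet" href="#" data-facet="width" data-value="4E">4E</a>
    // and a JSON endpoint at /api/products -- both hypothetical.
    document.querySelectorAll<HTMLAnchorElement>("a.facet").forEach((link) => {
      link.addEventListener("click", async (event) => {
        event.preventDefault(); // stop the navigation that would create a new URL

        const params = new URLSearchParams({
          [link.dataset.facet!]: link.dataset.value!,
        });
        const response = await fetch(`/api/products?${params}`);
        const products: { name: string; url: string }[] = await response.json();

        // Redraw the product grid in place; the page's URL never changes.
        const grid = document.querySelector("#product-grid")!;
        grid.innerHTML = products
          .map((p) => `<a href="${p.url}">${p.name}</a>`)
          .join("");
      });
    });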

This not only solves the indexation problem, but it also reduces the time that search engines will spend crawling pages you don’t intend to index anyway.

One final way to control the crawling of facets, without using AJAX, is to disallow certain sets of facets in robots.txt.
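
For example, if facet selections show up as URL parameters, a few wildcard Disallow rules can keep crawlers out of those combinations. The parameter names here are invented for illustration:

    # robots.txt -- keep crawlers out of hypothetical facet parameters
    User-agent: *
    Disallow: /*?*width=
    Disallow: /*?*color=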

This solution has the benefit of reducing crawling while still allowing search engines to return the pages in search results if other signals (in particular, on-site and off-site anchor text) suggest the page is a good result for a particular query.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Eric Enge is the CEO of Stone Temple Consulting, an SEO consultancy outside of Boston. Eric also writes for the Digital Marketing Excellence blog and can be followed on Twitter at @stonetemple.