How to fix duplicate content once and for all
Once we have identified duplicates, we must investigate the causes behind them. Let's look at some of the most frequent cases and the corrective strategies, which involve canonical tags, redirects, and noindex.

URLs: WWW vs non-WWW and HTTP vs HTTPS

Switching to an HTTPS certificate is always the right move, but the migration can hide pitfalls. An incorrect web server configuration, for example, can leave the same domain responding on both protocols, generating duplicates and cannibalization. The situation is very similar when no preferred version is chosen between the www domain and the non-www one. In both cases, the fix is a 301 redirect to the preferred version. Once the redirect is in place, verify the outcome with a redirect checker.

Faceted navigation: filters and sorting

Filters are both the blessing and the curse of every e-commerce site; when they are present, we use the technical term "faceted navigation". The UX benefits they bring can be nullified by poor canonical management: if canonicals are not implemented, the SERPs fill up with parameterized URLs that all present identical content. Let's start with category sorting: sort options reorder products by an attribute (price, recency, and so on), but the products are always the same, so the content is duplicated. The solution is a canonical tag pointing to the reference page. Filters are a more complex matter and deserve a separate article or, better yet, dedicated SEO consultancy.

Case-sensitive URLs

In Google's eyes, URLs are case-sensitive. It therefore treats:

sito.com/product/martello-grande
sito.com/product/Martello-Grande
sito.com/product/MARTELLO-GRANDE

as real duplicates.
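A quick Python sketch, using the example URLs above, illustrates the mismatch: exact string comparison, which is how distinct URLs are identified, sees three different pages, while a case-insensitive view sees a single piece of content.

```python
# The three case variants from the section above (hypothetical product URLs)
urls = [
    "https://sito.com/product/martello-grande",
    "https://sito.com/product/Martello-Grande",
    "https://sito.com/product/MARTELLO-GRANDE",
]

# Exact string matching sees three separate pages
print(len(set(urls)))                  # 3

# A case-insensitive comparison reveals they are really one piece of content
print(len({u.lower() for u in urls}))  # 1
```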
In this case, we recommend a 301 redirect to the preferred version and consistent internal linking.

URLs with and without a trailing slash

Similarly to the previous case, Google treats:

sito.com/product/martello-grande
sito.com/product/martello-grande/

as different URLs. Once again, we solve the problem by choosing a preferred version (with or without the trailing slash, as you prefer) and setting a 301 redirect from the other.

Tag pages and archives

What applies to faceted navigation also applies to tags: when they are not managed correctly, they cause more problems than they solve. While tag pages are often not strictly duplicated, too many of them can give the crawler indigestion. Better few but good.
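As a concrete reference for the 301 redirects recommended above (protocol, www, and trailing slash), here is a minimal sketch for an nginx server. It is a hedged example, not a drop-in config: example.com is a placeholder domain, and the SSL directives are omitted for brevity.

```nginx
# Any HTTP request, www or not, goes straight to the canonical HTTPS host
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key directives omitted for brevity

    # Strip a trailing slash (except on the root) with a single 301
    rewrite ^/(.*)/$ /$1 permanent;
}
```

Note that fixing URL case this way is harder: stock nginx rewrites are case-sensitive and have no lowercase function, so case normalization usually lives at the application level.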
If tag pages bring (or could bring) traffic, we leave them indexed. If they are not interesting for search traffic but are still used by visitors, we add a noindex. And if they are useful to no one, all that remains is to remove the URL from Google.

Conclusion: avoiding duplicates improves SEO

Managing an SEO project is a bit like completing a puzzle: the pages are the pieces, and the more pieces there are, the harder the puzzle becomes. If some of those pieces are duplicates, the game becomes even more complicated. Let's be smart and plan ahead. If you are thinking about restyling your site, for example, that could be the best time to fix this aspect: by rethinking the URL and filter logic, and with the help of robots.txt, we can prevent future duplicates.

Would you like to know more? Let's talk about it in the comments.
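P.S. A final caveat on the robots.txt route mentioned above. The paths and parameter names below are placeholders; adapt them to your own filter URLs.

```
# robots.txt — block crawling of parameterized filter and sort URLs
User-agent: *
Disallow: /*?sort=
Disallow: /*?filter=
```

Keep in mind that robots.txt prevents crawling, not indexing, and a page that cannot be crawled cannot show Google its noindex or canonical tag. It complements those tools; it does not replace them.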