Chris Long – Search Engine Land: News On Search Engines, Search Engine Optimization (SEO) & Search Engine Marketing (SEM)

Featured snippets: The 9 rules of optimization (Nov. 2, 2020)

Here's a set of guidelines to follow when optimizing for the featured snippet, like using an “is” statement and defining the topic concisely.

The post Featured snippets: The 9 rules of optimization appeared first on Search Engine Land.

In SEO, some might consider featured snippets the holy grail of rankings. They're a great low-hanging-fruit opportunity for many websites: by optimizing for the featured snippet, you can propel your site to the top of the search results while making only very small adjustments to your page's content. Throw away backlinks, performance and site architecture (only kidding). Featured snippet optimizations give you a chance at ranking in the first position without having to worry about all of those other factors.

The goal of this post is to provide you with a set of rules to reference when you find featured snippet opportunities. Think of this as a checklist to run through when you’re brainstorming how to optimize for featured snippets in your keyword set. 

What is a featured snippet?

A featured snippet is a two to three sentence summary of text that appears at the top of Google. Featured snippets provide an answer for a user’s query directly in the search results. Receiving a featured snippet can result in more traffic for a given page. 

The steps to receiving a featured snippet are as follows:

  1. Add a “What Is” heading
  2. Use an “is” sentence structure
  3. Fully define the topic
  4. Match the featured snippet format
  5. Don’t use your brand name
  6. Don’t use first person language
  7. Scale featured snippets
  8. Prioritize when you rank in the top five
  9. Iterate your optimizations

The featured snippet appears to be governed by a simpler algorithm than Google's “primary” one. It is much more influenced by simple on-page adjustments that clearly define the topic for users.

Featured snippets and voice search 

Keep in mind, as well, that one of the goals of the featured snippet is to fuel voice search. Google reads featured snippets back when users perform voice queries on mobile or Google Home devices, which means featured snippets must always make sense in this context. When optimizing for featured snippets, it pays to ask yourself: “How would my answer sound if read back via voice search?”

The types of featured snippets

When optimizing for featured snippets, you might notice that there are several different types. It's important to be aware of these different types so you understand how to structure your content for each. The most common types are as follows:

Paragraph: Two or three sentences of text pulled from a <p> HTML element. These are the most common type: 

List: A bulleted or numbered list generally pulled from either an <ol> or <ul> HTML element: 

Table: A table of information pulled from a <table> HTML element. These are the least common type: 

How to optimize for the featured snippet

Throughout the years, one of the things I've been able to home in on is how to optimize for the featured snippet. I've developed a set of rules that I follow when optimizing client pages for this SERP feature.

You can learn more about each rule below: 

Rule #1: Add a “What Is” heading

To start your featured snippet optimizations, you’ll want to look for a place in your content to add a “What Is [Keyword]” heading tag. This sends clear signals to Google that text that could be used for the featured snippet is upcoming. We’ve seen countless examples of pages that receive the featured snippet using this heading format. When replicating this strategy for our clients, we’ve seen very good success rates. 

Ideally, you'll add this heading as close to the top of your content as possible. For a blog post, I'll generally add it right below the introductory paragraph. This placement usually flows well with the content while keeping the section near the top of the page.

For instance, here’s a great example on TechnologyAdvice. We can see that they include this section right below their “Table of contents” at the top of the page: 

Adding this section gives Google a straightforward indication of which text it can pull into the featured snippet. The result is that TechnologyAdvice receives the answer box for the competitive keyword “project management software.” This actually helps them rank above Capterra, which is very difficult to do in SaaS SEO.
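As a sketch, the on-page pattern this rule describes might look like the following in HTML. The copy here is hypothetical; only the keyword comes from the example above:

```html
<h1>Project Management Software: A Complete Guide</h1>
<p>Introductory paragraph...</p>

<!-- The "What Is" heading signals that snippet-ready text follows -->
<h2>What Is Project Management Software?</h2>
<p>Project management software is a tool that helps teams plan, track
and deliver projects. It typically centralizes tasks, schedules and
files so that work stays on time and within budget.</p>
```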

Rule #2: Use the “is” sentence structure

When optimizing for the featured snippet, it’s really important to include an “is” statement. This means that the very first sentence should start with the structure: “[Keyword] is”. Below are some examples from results that are getting the featured snippet: 

“Agile methodology is a type of project management process, mainly used for software development…”

“Customer relationship management (CRM) software is software that automates and manages the customer life cycle of an organization.”

“Return on Investment (ROI) is a performance measure used to evaluate the efficiency of an investment or compare the efficiency of a number of different investments.”

When analyzing pages that are receiving featured snippets, we consistently see that “is” statements are utilized within the text. In our experience, structuring content this way appears to act as a “triggering phrase” that allows Google to easily find the text that’s relevant for the featured snippet. 

When optimizing your pages for the featured snippet, try to ensure that your first sentence follows this format. By using an “is” statement, you should see a higher percentage of your optimizations result in winning the featured snippet. 

Rule #3: Fully define the topic in 2-3 sentences

To me, this is the most important rule to follow. 

Featured snippets are meant to give users as much information about the topic as possible in a short amount of time. This means the content you're optimizing must describe the topic as completely as possible in two to three sentences. For this rule, being concise is extremely important.

Here are some general guidelines we try to follow when writing concise featured snippet definitions:

  1. The first sentence should define the topic
  2. The second and third sentences should describe 2-3 must-know facts about the topic
  3. Try to avoid using any extraneous phrasing in your definition

Here’s a great example from Investopedia:

  1. Forensic accounting is a specific accounting technique to discover crimes
  2. It’s used to provide evidence of financial misconduct to courts
  3. Forensic accounting is heavily used in the insurance industry 

This follows the above pattern of first describing the topic and then providing users with two must-know facts about it. Also notice how the text doesn’t use unnecessary words within the definition. It’s short and to the point. 

Rule #4: Match the featured snippet format

As we explained earlier, there are several different types of featured snippets. These are: 

  1. Paragraph (most common)
  2. Bulleted & Numbered List
  3. Table (least common)

This rule is extremely simple. Whatever featured snippet type you see on the SERP, match that type in your content. 

For example, if you see that a paragraph featured snippet is appearing for the term you want to optimize for, then you need to find a place to add/adjust two or three sentences of text. However, if you’re seeing a bulleted list appear, then you need to add a similar bulleted list to your page’s content. 
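For instance, if a list-type snippet is showing, the supporting markup can be as simple as an ordered list under a descriptive heading. This is a generic sketch with placeholder copy, reusing the ROI example from rule 2:

```html
<h2>How to Calculate ROI</h2>
<!-- List snippets are generally pulled from an <ol> or <ul> element -->
<ol>
  <li>Determine the net profit of the investment</li>
  <li>Divide the net profit by the total cost of the investment</li>
  <li>Multiply the result by 100 to express it as a percentage</li>
</ol>
```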

Rule #5: Never use your brand name in featured snippet text

This is a mistake that we see all the time when optimizing for the featured snippet. A company will get rules 1-4 right but will use some language that makes the result ineligible for the featured snippet. 

Brand names are one example of such language. 

Remember that featured snippets are used to fuel voice search. Devices such as Google Home will read what’s in the featured snippet directly back to users. This means that what’s in the featured snippet needs to make complete sense in this context. 

As an example, imagine that Wegmans is trying to optimize for the featured snippet for “health benefits of avocado.” Let's say the following sentence was used in the attempt:

“Avocados from Wegmans have many health benefits as they are a great source of riboflavin, vitamin C and potassium.”

This might be confusing if read back via voice search. The user was looking for general benefits that apply to all avocados, not just the ones sold at Wegmans.

Replacing the brand name with general language gives the content a higher chance of receiving a featured snippet. Instead, the optimization could look something like this:

“Avocados have many health benefits as they are a great source of riboflavin, vitamin C and potassium.”

Rule #6: Don’t use first person language

Similar to rule 5, using first-person language can be a mistake because of the ramifications for voice search. Using the above example, let's say the on-page text optimized for the featured snippet read:

“Our avocados have many health benefits. We have avocados that are a great source of riboflavin, vitamin C and potassium.”

Once again, imagine if this sentence was read from voice search. The user might be left confused and wondering: 

  1. Who is the “we” that is being referenced? 
  2. Does this information only apply to their product? 

Once again, this sounds like the information might be specific to a certain type of avocado but might not apply to the food in general. Limiting this type of phrasing may also help you improve your chances of receiving a featured snippet. 
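Rules 2 through 6 are mechanical enough to sketch as a quick pre-publish check. The function below is a minimal illustration of those checks, not a tool described in this article; the naive sentence splitting, the first-person word list and the brand list are all assumptions:

```python
import re

def lint_snippet(text, keyword, brand_terms=()):
    """Flag common featured-snippet optimization mistakes (rules 2-6)."""
    issues = []
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    first = sentences[0].lower() if sentences else ""
    # Rule 2: the first sentence should read "[Keyword] ... is ...".
    if not (first.startswith(keyword.lower()) and " is " in first):
        issues.append("first sentence is not an 'is' statement")
    # Rule 3: fully define the topic in two to three sentences.
    if not 2 <= len(sentences) <= 3:
        issues.append(f"{len(sentences)} sentence(s); aim for 2-3")
    # Rule 5: keep brand names out of the snippet text.
    for brand in brand_terms:
        if brand.lower() in text.lower():
            issues.append(f"contains brand name '{brand}'")
    # Rule 6: avoid first-person language.
    if re.search(r"\b(we|our|us|i)\b", text, re.IGNORECASE):
        issues.append("uses first-person language")
    return issues
```

Running the Wegmans avocado sentence from rule 5 through a check like this would flag the missing “is” statement, the single-sentence definition and the brand name.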

Rule #7: Scale featured snippets when possible

Throughout the years, we've seen interesting behavior with bulleted-list featured snippets. For example, a search for the term “food franchises” yields the following featured snippet:

However, when looking at the page, there is no specific bulleted list to be found. 

Instead, this page is set up as a standard eCommerce category page. What Google appears to be pulling the featured snippets from is actually the individual product listings within the category page: 

These all appear to be formatted as H3 tags. This shows us that in some results, featured snippets can come from Google scraping heading tag information.

This allows for an interesting opportunity as featured snippets can be scaled with adjustments to the HTML. For some of our clients, we’ve recommended adjusting the HTML on category pages from standard paragraph tags to H2 or H3 tags. This might send stronger signals that could scale featured snippet optimization at a global level. 

Take the time to review where Google is pulling your competitor featured snippets from. If you’re seeing common HTML elements, consider adjusting your global templates to give your content the best chance of triggering the featured snippet. 
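A minimal sketch of the category-page adjustment described above, with each listing title marked up as an H3 rather than a paragraph (the product names and class names are placeholders):

```html
<h1>Food Franchises</h1>
<!-- Each listing title is an <h3> instead of a <p>, giving Google
     heading text it can scrape into a bulleted-list snippet -->
<div class="listing"><h3>Example Burger Franchise</h3></div>
<div class="listing"><h3>Example Coffee Franchise</h3></div>
<div class="listing"><h3>Example Pizza Franchise</h3></div>
```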

Rule #8: Prioritize opportunities where you rank in the top 5

Previous studies have shown that ranking position matters when it comes to claiming a featured snippet. Simply put, the higher you rank, the better your chances. A previous study from Ahrefs showed that results ranking in the first position had a 30.9% chance of receiving a featured snippet; positions 2 and 3 had a 23.5% and 15.9% chance, respectively.

While this data might look different now that Google deduplicates featured snippets from the organic listings, the takeaway for SEOs is clear: the higher your site ranks in the “standard” results, the better its chances of generating a featured snippet. When prioritizing, look for keywords where you already rank within the top five results.

Rule #9: Iterate your optimizations

You've followed the steps above. You've written fantastic on-page content that clearly describes the topic under a dedicated “What Is” heading at the top of the page. You've also been sure to steer clear of any brand or first-person terminology. You push your optimizations to production and wait for Google to re-index your content.

When Google finally indexes your new changes, you find that your page still isn’t generating the featured snippet.

This is not the time to stop optimizing. Instead, iterate your approach and try again. For many of the featured snippets we get, it can take multiple iterations.

In this phase, I’ve found that using the above process generally gets you 80% of the way there. If your result still isn’t receiving a featured snippet, I’ve found that very minor adjustments tend to work well. I’ll generally look for opportunities to better define the topic, use even more concise phrasing or test highlighting different facts (see rule 3). Start with minor adjustments and work your way to more major ones if you’re still not seeing the results you want.

Oftentimes, you'll find that the featured snippet can be obtained after a few rounds of iteration to really perfect the language.


I hope these rules help provide you with a set of guidelines to follow when optimizing for the featured snippet. Remember, it’s extremely important to use an “is” statement and to fully define the topic in two to three sentences. By following the rules above, you should be able to significantly improve how many featured snippets you’re able to receive for your site.


Does Google respect the URL parameters tool? (July 17, 2020)

E-commerce sites should not make assumptions about Google's crawling of parameters; check the log files to confirm activity.


Anyone running an e-commerce site is probably familiar with the “URL Parameters Tool,” a feature in Google Search Console that SEOs have long used to help control the crawl of their websites. In this tool, you inform Google of what your different URL parameters do and how Google should crawl them (“Let Googlebot decide,” “No URLs,” etc.). Google has provided extensive documentation on the different settings that can be configured and how the crawl commands interact with each other.

However, recently Google has moved this tool to the ambiguous “Legacy tool and reports” section. Ever since that time, I’ve wondered what that meant for the tool. Is this just a way of categorizing an older feature? Does Google plan on sunsetting it eventually? Does Google even still use the commands here? 

Something else I’ve found interesting is that when reviewing client log files, we’ve encountered examples where Google didn’t appear to be abiding by the rules set in the URL parameter tool. 

To find out more, I decided to perform a test. I took one of our test sites and found URL parameters that Google was crawling. Using Google’s Index Coverage report, I was able to confirm Googlebot was crawling the following parameters:


On June 26, I went ahead and added these parameters in Google's URL Parameters tool, instructing Googlebot to crawl “No URLs.”

I then waited and monitored Google's crawl of the site. After collecting a couple of weeks' worth of data, we could see that Google was still crawling these URL parameters. The primary parameter we were able to find activity on was “?cat” URLs:

Zooming out a bit further, you can see that these are verified Googlebot events that occurred on June 27 or later, after the crawl settings had been configured: 

We were also able to confirm crawl activity on both “?cat” and “?utm” URLs using Google's URL Inspection tool. Notice how the URLs show “Last crawl” dates after the new rules went into place.

What does this mean for SEOs? 

While we're not seeing overwhelming crawl activity, it is an indicator that Google might not always respect the rules in the URL Parameters tool. Keep in mind that this is a smaller site (around 600 pages), so the scale at which these URL parameters get crawled is much lower than on a large e-commerce site.

Of course, this isn't to say that Google always ignores the URL Parameters tool. However, in this particular instance, we can see that it might be the case. If you run an e-commerce site, I would recommend not making assumptions about how Google is crawling your parameters and checking the log files to confirm crawl activity. Overall, if you're looking to limit the crawl of a particular parameter, I'd rely on the robots.txt first and foremost.
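For reference, a robots.txt rule that blocks the crawl of the “?cat” parameter tested above might look like the following. The wildcard patterns are illustrative; test any rule before deploying it, since over-blocking can hide pages you want crawled:

```
User-agent: *
# Block crawling of any URL containing the "?cat" parameter
Disallow: /*?cat=
# Block tracking-parameter URLs whether utm_ appears first or later
Disallow: /*?utm_
Disallow: /*&utm_
```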


Blogging for Shopify: A unique SEO approach (May 26, 2020)

Shopify stores may need to create blog content to rank for keywords that seem strictly transactional.


One of the most common opportunities we see for Shopify stores is to create blog content for SEO. Often, Shopify sites target keywords that have informational intent with transactional pages. This creates a discrepancy between the content the store has and what Google is “willing” to rank; the store simply doesn't have pages that support the intent of those keywords. In this post, we'll talk about ways you can identify and fix this discrepancy.

Identifying keyword intent for Shopify 

Let's use an example of a query for the term “selfie camera.” On the surface, this appears to be a transactional query: users searching for “selfie camera” seem ready to purchase the product.

However, when we look at the SERPs in Google, we actually see something different:

All three of the top results for this term are informational in nature as opposed to transactional. This means that if you’re a Shopify store trying to rank a collections page for this query, you might not have any luck. 

This is an example of Google displaying what the user intent of the keyword is. By displaying these results, Google is showing us that users want informational “Listicle” types of content for this query. Users don’t just want to see all of the “selfie cameras” that you have available in your Shopify store, they want to see the best selfie cameras that the market has to offer. 

This means that the opportunity to rank for this keyword doesn’t exist within your category page. Instead, the opportunity to rank for this coveted keyword lies within your Shopify blog. 

Related: Shopify SEO Guide: How to increase organic traffic to your store

Here is another interesting example. Similar to “selfie camera,” another keyword that appears to be transactional is “cloth diaper.” Once again, on the surface we would assume this keyword is queried by users who want to purchase cloth diapers.

When looking at the SERPs, we can see that in this instance it is partially true. We can see that this term has mixed intent: 

The top three ranking pages are in fact transactional, serving users who are looking to purchase cloth diapers:

  1. Cloth Diapers at Cotton Babies – makers of bumGenius
  2. Reusable Baby Cloth Pocket Diapers, 6 pcs + 6 … –
  3. Green Mountain Diapers: Cloth diapers for baby featuring …

However, the results taking up positions 4 & 5 are more informational in nature: 

  1. 6 Best Cloth Diapers of 2020 – Babylist
  2. Cloth Diapering 101: Everything You Need to Know

This is called a “mixed intent” result. There are two different user intents active here. One is transactional (to purchase a cloth diaper), while the other is informational (to learn more information about best practices). Both types of pages are eligible to appear on the first page for this term. This means that there is an opportunity for some of these sites to appear more than once. 

For instance, let's take a look at the result in the #1 position. This store is clearly already doing a great job with its SEO, as it has claimed the first position here. However, there might also be an opportunity for another one of its results to appear: it is only ranking for the transactional intent, not the informational one.

Looking through their site, they do have a page targeted towards “Cloth Diaper Basics.” This is essentially a blog post type of content under their /pages/ URL path. Looking at the page, we can see that it’s a very detailed FAQ that provides users with answers to all of their questions on the subject of “cloth diapers:” 

This page presents a great opportunity to improve the optimization to better target the informational nature of the keyword. They could consider making adjustments such as:

  • Changing the format from an FAQ to “Guide” type of content
  • Organizing the FAQ into clear categories (Benefits, Cleaning, etc.)
  • Looking for opportunities to fill content gaps (e.g., types of cloth diapers)

This might give both their Shopify category page and their blog post a chance to rank in the top 10 for a very important keyword of theirs. 

It might seem difficult to rank pages for both transactional and informational intent, but it certainly can be done.

For instance, one brand does a great job of this for the keyword “instagram scheduler.” Knowing that this is a very important term for them, they have optimized their home page for the transactional intent and a blog post for the informational intent. The result is that they claim the first and second positions for the keyword:

This is a strategy that a lot of Shopify stores could benefit from.

Finding Shopify blog opportunities

All this information is certainly great, but how can you find opportunities for your Shopify blog? Fortunately, the process is pretty straightforward:

  1. Identify your high value keywords: This can be keywords that you know are likely to generate a lot of revenue for your store. You can also use AdWords data to tie your keywords to revenue to find these. 
  2. Manually review their intent: Next, manually perform searches for each of these keywords. Note what types of results Google is actually returning here. Are they informational or transactional or a mix of both? 
  3. Inventory your own site content: Does your Shopify store have the content to match the intent of each keyword? Do you have a category page created for queries returning product listing pages? Do you have posts written for queries with informational content? 
  4. Optimize/Create Content: Next you’ll need to determine what execution steps you will take. If you have the content infrastructure, then you might simply need to optimize and adjust the targeting of existing pages. If not, you may need to create new ones.  

How to create new Shopify blogs for content gaps

If you’ve identified that you don’t have the blog content to compete for your store’s core keywords, you’ll need to go out and create it. This can be a bit of a daunting task, especially if you have to start from scratch. 

Fortunately for you, there's no better SEO research tool out there than the Google search results themselves. By reviewing them, you'll be able to see exactly what types of content are ranking well in the search engines.

1. Note common topics of top ranking posts

This is probably one of the most important aspects of ensuring that your Shopify blog posts rank well. Oftentimes, users expect to have certain types of questions answered when they’re looking for an informational article. It’s your goal to ensure that your content answers all of those questions. 

To do this, start by noting what types of content the URLs on the first page consistently have. For instance, back to our “cloth diapers” example, we can see that both of the informational articles talk about “Types Of Cloth Diapers”: 

Of course, this means we'll want to be sure our content also contains this information. Both Google and users likely expect this content to be on the page for it to be considered relevant enough to rank for the term.

2. Review “related searches” 

You can also review the “Related Searches” at the bottom of the search results. This will show you other queries that users generally search around this topic. Oftentimes, this will include ideas of topics that you could utilize in your own content. 

For instance, here are the examples of “Related Searches” that appear for this query: 

While obviously many of these ideas are branded and won’t be a good fit for our content (cloth diapers amazon, cloth diapers walmart), there are still ideas we could use. For instance, “how to clean cloth diapers” would be a great section to add to our pages. You can use this section to find additional content ideas around your core topic. 

3. Utilize hub content

In SEO, the concept of “hub content” is becoming more and more popular. Essentially, we often see Google ranking articles that not only answer the question but also link to other internal resources the site offers around that topic. As an example, Moz's “Beginner's Guide To SEO” ranks very well for the term “SEO,” despite having very limited on-page content. As of this writing, the page contains about six to seven paragraphs of content at the top; from there, it's simply an aggregated list of internal links:

If you're writing content for your Shopify blog, try to make it a piece of “hub content.” If your site already contains other useful resources that users would find helpful, ensure that you're linking to them within the post. We recommend linking to them within the on-page content as needed, as well as in a “Resources” section at the bottom.

This shows both users and Google that your site not only has the content to answer the original query but can also help answer other related questions users might have.
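As a sketch, that “Resources” section could be as simple as a heading and a list of internal links at the end of the post. The URLs and titles below are placeholders (Shopify blog posts typically live under a /blogs/ path):

```html
<h2>Resources</h2>
<ul>
  <li><a href="/blogs/news/how-to-clean-cloth-diapers">How to Clean Cloth Diapers</a></li>
  <li><a href="/blogs/news/types-of-cloth-diapers">Types of Cloth Diapers</a></li>
  <li><a href="/collections/cloth-diapers">Shop All Cloth Diapers</a></li>
</ul>
```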

4. Upsell to product pages

We understand that one of the frustrating parts of this process is that these pages are informational in nature and are inherently less revenue focused. If these pages aren’t going to be revenue drivers, then there’s less incentive to create and optimize these blogs. 

That’s why it’s important to ensure that your Shopify blog is offering upsell opportunities for related products. You can do this in several ways: 

  1. Ensure you link to “Related Products” at the end of posts
  2. Include links within the posts to relevant product and category pages

This will give your blog posts a better chance of resulting in conversions from users.


While blogging might seem like a second priority for Shopify stores, it can often be extremely important for SEO. Shopify stores may need to create blog content to rank for keywords that appear transactional in nature but for which Google is actually ranking informational results. Always be sure you understand the search intent of your high-value keywords, as that will be pivotal to your content strategy.



SMX Overtime: Schema and structured data — hidden gold for SEOs (April 16, 2020)

Technical SEO Chris Long discusses the benefits of schema, MREIDs and markup for e-commerce category pages and more.


SMX Overtime is part of our SMX speaker series from conference presenters who answer questions from attendees on a variety of topics.

Q. Is it worth implementing schema if there are no visible changes to the search result on the SERP, just to help Google better understand your content generally?

A. Yes, it can definitely be worth implementing structured data, even if there is no direct impact for rich results in the SERPs. As we know, Google is constantly adjusting the SERP landscape so new rich results are always a possibility. Structured data can help Google better understand the content of a page as well as help it understand how that content relates to entities it identifies. 

Related: Learn more about structured data in the HTML section of our SEO Guide.

Q. Have you incorporated MREIDs in client structured data? If so, have you seen any benefits and any reduced ambiguity with the SERPs for any given queries?

A. MREIDs (machine readable entity IDs) are definitely something that could benefit Google’s understanding around the primary entity of a page.

MREIDs provide Google with a dedicated string associated with a particular entity. This would be similar to how a SKU number references a specific product. While structured data’s direct impact on rankings is a bit nebulous, using MREIDs is a great opportunity to help Google more directly understand the entity, especially when the entity shares the naming convention of others. Bill Slawski has found that Google has used similar Machine IDs in the past to improve image search.

By referencing the MREID in a “sameAs” property, Google could know with more certainty the exact entity being referenced. In a way, this is similar to referencing the entity's associated Wikidata page within the structured data.

Q. What are the best practices for schema markup for product teasers on e-commerce category pages? Some suggest a list of Products; other documentation says a Product shouldn't point to another page.

A. Generally, we recommend marking up each item on product listing pages with “Product” markup that defines the information given within the visible content of each listing. Of course, this should be dynamically generated so it always reflects the page’s inventory.

If the goal of structured data is to help Google better understand the content of a page, then defining each item in the list may help Google understand that the page contains an aggregation of products.
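One common way to express this is an ItemList whose elements wrap Product entries mirroring the visible listing data. This is an illustrative sketch rather than a guaranteed rich-result recipe; the product name, URL and offer values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@type": "Product",
        "name": "Example Product",
        "url": "https://www.example.com/products/example-product",
        "offers": {
          "@type": "Offer",
          "price": "199.00",
          "priceCurrency": "USD"
        }
      }
    }
  ]
}
</script>
```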

See more articles on structured data and schema.


3 case studies of duplicate content consolidation (Nov. 14, 2017)

Columnist Chris Long shares examples of how he addressed duplicate and similar website content to improve organic search performance for his clients.



It’s commonly held that duplicate or substantially similar content is bad for SEO. When Google finds duplicate content, this creates a conflict for the algorithm. Essentially, Google gets confused as to which page should be the primary ranking URL for a given search query, so it chooses the one it believes to be the most relevant. Unfortunately, the URL it chooses may not be the one you wish to display — and, in cases of exact duplicate content, the other versions of the page may even be filtered out of search results.

The best way to fix this issue is to consolidate the duplicate or similar content's ranking signals into a single version of the page. In practice, this is usually done by implementing a 301 redirect, a canonical tag or a “noindex” tag.

While many of us know this to be true, it can often be helpful to see examples of the different types of duplicate content that exist in the wild and how to best handle them. To better help you find and fix duplicate/similar content, I’ve provided case studies for three different instances where we consolidated these types of pages and noted the results we saw.

1. Consolidating exact duplicate pages

The simplest type of duplicate content issue you may encounter is a straightforward duplicate page. This page will contain the exact same content as another page on the website. For one of our clients, which advertises franchise listings, we found two pages that contained the exact same content targeted toward “low-cost franchises.” The two pages were identical.

Because there was no need for both of these pages, we suggested that our client 301 redirect one of them to the other page. Personally, I love using 301 redirects if possible because it points both users and link equity to a single URL. It tends to send a stronger signal than the other consolidation methods. Almost immediately, we saw rankings and traffic to the original page spike.

Since the 301 redirect was implemented, organic traffic improved by over 200 percent to the page, and it is now consistently one of the top three pages on the site each month.

How did we decide which page to 301 redirect? To do this, we took a look at three different factors:

  1. Which page the site internally linked to the most.
  2. Which page was currently ranking the best.
  3. Which page had the most organic traffic historically.

The final destination page we selected had the most internal links and traffic and was also ranking ahead of the other. I would definitely urge you to look at these three factors when deciding which page to consolidate your duplicate pages to.
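The three-factor decision above can be sketched as a simple scoring function. The field names and numbers are illustrative, not from a real crawl, and the weighting (internal links first, then current rank, then traffic) is one reasonable reading of the factors rather than a stated formula.

```python
# Sketch: picking the 301 destination among duplicate pages using the three
# factors described above. Data is hypothetical.

def pick_destination(pages):
    # More internal links is better; a lower rank number is better (so negate
    # it); more historical organic traffic is better.
    return max(pages, key=lambda p: (p["internal_links"], -p["rank"], p["organic_traffic"]))

pages = [
    {"url": "/low-cost-franchises/",   "internal_links": 120, "rank": 4,  "organic_traffic": 900},
    {"url": "/low-cost-franchises-2/", "internal_links": 8,   "rank": 15, "organic_traffic": 40},
]
print(pick_destination(pages)["url"])  # the page the other duplicate should 301 to
```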

2. Consolidating semantically similar pages

As Google gets better and better at understanding semantically related topics, the search engine is starting to return more results that contain topics outside of the initial query. For instance, in a search for “braces near me,” I see a lot of results for orthodontists, even though the term “orthodontist” wasn’t in my original search. Google is most likely doing this type of consolidation for some of your core keywords, and you should be aware of what it’s grouping together.

The client mentioned above has done a good job of building out landing pages that target different industry options (Auto Franchises, Cleaning Franchises and so on). This included the following two pages: Food Franchises and Fast Food Franchises.

At first glance, it might seem obvious that searches for these two terms would yield different results. However, we saw that Google was treating them somewhat interchangeably.

It appeared that Google had collected enough user data to determine that searchers wanted similar results for these two queries. Because neither of these pages was ranking well at the time, and they both contained very similar content, we recommended that the client consolidate the ranking signals.

Our client still wanted users to be able to access both pages, so we recommended they implement a canonical tag instead of a 301 redirect. They added the canonical tag on the “Fast Food” page that pointed to the “Food” page because the latter gave users a list of all the franchises under both categories.

Once again, the results were convincing.

Organic traffic to the page has improved by 47 percent since implementation. This shows us that it’s important to not only consolidate pages where standard keyword targeting and content overlap, but also where there might be conflict with other semantically related pages.

3. Consolidating URL parameters

Last but certainly not least is looking for URL parameters that Google may be finding and assessing as duplicates of other pages. While it’s not always the case, often URLs with parameters appended to them will contain duplicate or very similar content to the source page.

This was certainly the situation for another one of our clients. We found that many of their key pages were generating a large number of URLs with different parameters. While these pages did contain slightly different content, to the search engines they appeared to be largely the same.

We solved this issue by using a canonical tag. We instructed the client to dynamically implement canonical tags that would reference the primary landing pages that Google should be ranking. As Google slowly removed these URL parameters from the index, we began to see a large shift in rankings.
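A minimal sketch of that dynamic canonical logic: derive the canonical URL by stripping the query string, on the assumption that the parameter-free URL is the primary landing page Google should rank. The URL is a made-up example.

```python
# Sketch: deriving a canonical URL for a parameterized page by dropping the
# query string and fragment, assuming the bare path is the primary page.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    parts = urlsplit(url)
    # Keep scheme, host and path; discard query parameters and fragment.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_url("https://www.example.com/widgets/?color=red&sort=price"))
# https://www.example.com/widgets/
```

In a real implementation this would run in the page template, so the canonical tag always reflects the correct primary URL; some parameters (pagination, for instance) may deserve different handling.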

Organic traffic followed suit, and the website now generates over 800 percent more than our baseline levels.

While we have been working on many other aspects of the site, there’s no doubt in my mind that this was a major factor in the ranking and traffic increases we’ve seen.

Finding consolidation opportunities

All of this raises the question: How do you find instances of duplicate or similar content that can be consolidated?

While there are many different tools out there that can help you with this, I’ve found no substitute for looking manually. When evaluating whether duplicate/similar content is a potential problem, I start with a “site:” search of the domain followed by its core keywords. If I see pages with similar metadata in Google’s index, that’s a red flag that they may be duplicates.

I repeat this process for as many keywords as necessary until I have a good understanding of the problem.
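The same red flag can be surfaced at scale from a crawl export: group pages that share a title tag and review any group with more than one URL. This is a sketch with made-up data, not the author's actual process, which is manual.

```python
# Sketch: flagging potential duplicates from a crawl export by grouping URLs
# that share the same (normalized) title tag. Data is illustrative.
from collections import defaultdict

def find_duplicate_titles(pages):
    groups = defaultdict(list)
    for url, title in pages:
        groups[title.strip().lower()].append(url)
    # Only groups with more than one URL are potential duplicates.
    return {title: urls for title, urls in groups.items() if len(urls) > 1}

pages = [
    ("/low-cost-franchises/",   "Low-Cost Franchises"),
    ("/low-cost-franchises-2/", "Low-Cost Franchises"),
    ("/food-franchises/",       "Food Franchises"),
]
print(find_duplicate_titles(pages))
```

Shared titles are only a signal; each flagged group still needs the manual review described above before consolidating anything.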

I highly recommend researching duplicate content issues manually to completely understand the nature of the problem and the best way to address it. Doing so can lead to massive improvements for the organic search performance of an individual page, or even an entire website.


3 ways to improve link equity distribution and capture missed opportunities /3-ways-improve-link-equity-distribution-282387 Fri, 22 Sep 2017 14:35:07 +0000 /?p=282387 You've worked hard to accumulate as much link equity as possible from external sources, but is your internal linking structure diluting that equity? Columnist Chris Long details how to reclaim your lost link value.

The post 3 ways to improve link equity distribution and capture missed opportunities appeared first on Search Engine Land.


There’s a lot of talk about link building in the SEO community, and the process can be time-consuming and tedious. As the web demands higher and higher standards for the quality of content, link building is more difficult than ever.

However, few SEOs are discussing how to better utilize what they already have. There seems to be an obsession with constantly building more and more links without first understanding how that equity is currently interacting with the website. Yes, more links may help your website rank better, but your efforts may be in vain if you’re only recouping a small portion of the equity. Much of that work dedicated to link-building efforts would then be wasted.

For many websites, there is a big opportunity to improve upon the link equity that has already been established. The best part about all of this is that these issues can be addressed internally, as opposed to link building which typically requires third-party involvement. Here are some of my favorite ways to reclaim lost link value.

1. Redirect old URL paths

On client websites, I often see discontinued product pages that haven’t been redirected or entire iterations of old websites where almost all of the URLs are returning 404 errors. Leaving these pages broken leaves too much unused link equity on the table.

Finding old URL paths and 301 redirecting them can lead to huge wins in search engine visibility. In one fell swoop, you can reactivate the value of hundreds or even thousands of links that are pointing toward your domain.

So the question becomes, how can you surface these old URLs?

There are a few different methods I use, depending on the resources I have at hand. Occasionally, I’ve had clients who just went through a migration that moved their old website to a staging site. If this is the case, you should be able to configure Screaming Frog to crawl the staging environment (you may need to ignore robots.txt and crawl nofollow links). After the crawl is complete, simply export the data to a spreadsheet and use Find/Replace to swap out the staging domain with the root domain, and you should have a comprehensive list of old URL paths.
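The Find/Replace step translates directly into a one-liner. The staging and live domains below are hypothetical stand-ins for whatever your crawl export contains.

```python
# Sketch: swapping a hypothetical staging domain for the live root domain
# across a Screaming Frog crawl export, mirroring the Find/Replace step.
staging = "https://staging.example.com"
live = "https://www.example.com"

crawled = [
    "https://staging.example.com/old-product/",
    "https://staging.example.com/category/widgets/",
]
old_urls = [url.replace(staging, live, 1) for url in crawled]
print(old_urls)
```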

However, what if you don’t have access to any resources that list old URLs? For these situations, I use a combination of Ahrefs, Google Analytics and Google Search Console (credit to Dan Shure’s article on redirect chains, which helped me refine this process).

First, using Ahrefs, I’ll enter my domain, and then click the “Best Pages By Links” report.

From there, I export the entire report into an Excel file. It’s important that you export all of the URLs Ahrefs gives you, not just the ones it identifies as 404 errors. Ahrefs will only provide the initial status code the URL returns, which can be misleading. Often, I’ll see situations where Ahrefs identifies the status code as a 301, but the URL actually redirects to a 404.
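The reason the initial status code can mislead is that only the final hop of the chain matters. Here is a sketch of resolving a chain to its final status; the `responses` dict stands in for live HTTP lookups, which is how a crawler like Screaming Frog catches the 301-to-404 cases Ahrefs misses.

```python
# Sketch: resolving a redirect chain to its final status code, so a URL that
# initially returns a 301 but ultimately 404s isn't missed. `responses` maps
# each URL to a (status_code, redirect_location) pair and stands in for
# real HTTP requests.

def final_status(url, responses, max_hops=10):
    for _ in range(max_hops):
        status, location = responses[url]
        if status not in (301, 302) or location is None:
            return status
        url = location
    return None  # redirect loop or chain too long

responses = {
    "/old-page/":   (301, "/newer-page/"),
    "/newer-page/": (301, "/gone/"),
    "/gone/":       (404, None),
}
print(final_status("/old-page/", responses))  # 404, despite the initial 301
```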

Once I have my Excel file, I run the URLs through Screaming Frog using “List Mode” and export the 404 errors it finds into a master Excel document.

Next, I go to Google Analytics and navigate to the “Landing Pages” report. I’ll typically set the date ranges for as far back as the account tracks, but this varies for each situation. I’ll export all of the data it gives me to a spreadsheet and then add the domain name in front of the relative URL path using Excel’s CONCATENATE function.
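The CONCATENATE step is equally simple in code: prefix the domain onto each relative landing-page path from the Analytics export. Paths and domain are hypothetical.

```python
# Sketch: the Excel CONCATENATE step in code, prefixing the domain onto the
# relative landing-page paths exported from Google Analytics.
domain = "https://www.example.com"
landing_pages = ["/old-category/", "/discontinued-product/"]

full_urls = [domain + path for path in landing_pages]
print(full_urls)
```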

I once again run this list through Screaming Frog and add the 404 errors it finds to the master document.

Finally, I log in to Google Search Console, open up the “Crawl Errors” report, and navigate to the “Not Found” tab. I export these URLs and confirm that they do, in fact, return 404 status codes by using Screaming Frog. I add these 404 pages to the master document.


Now there’s one master spreadsheet that contains all of the potential broken URLs in one place. De-dupe this list and run Screaming Frog in “List Mode” and export the URLs that return 404 status codes.

To help prioritize which URLs to redirect first, I connect Screaming Frog to the Ahrefs API, which will allow the crawler to gather the link metrics associated with each page. I sort that list by number of linking root domains and assign priority to the redirections that way.

After I have the final list of 404 errors, it’s simply a matter of identifying the destination pages on the client website each URL should redirect to. To scale this effort, I often use a combination of MergeWords and the OpenList Chrome extension.

2. Analyze the .htaccess file

When evaluating how your website distributes link equity, it’s important to understand how your global redirects are working as well. This is where the .htaccess file comes into play. In this file, you can see the syntax that instructs your website how to handle redirect rules.

When using a tool like Ahrefs, if I’m seeing common redirect patterns, this is a good sign that these rules are defined in the .htaccess file.

Often, I’ll see that the .htaccess file is causing 302 redirects that should be 301, pushing unnecessary redirects (causing redirect chains), or missing redirect rules that should be there. For instance, a common mistake I see are files that 302 redirect HTTP URLs to HTTPS instead of 301.

Each situation is entirely different, but here are some of the .htaccess rules I commonly look for:

  • “HTTP” to “HTTPS” rules
  • Non-WWW to WWW rules
  • URL capitalization rules
  • Trailing slash rules

There are many opportunities to better control the directives of the .htaccess file. If you’re noticing similar patterns of improperly configured redirects, it may be worth pulling this file and talking to your developers about how these issues can be fixed.
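The four rule types above can be expressed together as a single URL normalizer, which also shows why doing them in one hop avoids redirect chains. This is a sketch, assuming the preferred format is HTTPS, www, lowercase and a trailing slash; your site's preferred format may differ.

```python
# Sketch: the four .htaccess rule types above as one normalization function,
# assuming https + www + lowercase + trailing slash is the preferred format.
# Redirecting any variant straight to normalize(url) avoids chaining one
# rule's redirect into another's.
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host                # non-WWW -> WWW rule
    path = parts.path.lower()               # URL capitalization rule
    if not path.endswith("/"):
        path += "/"                         # trailing slash rule
    return urlunsplit(("https", host, path, parts.query, ""))  # HTTP -> HTTPS rule

print(normalize("http://Example.com/About-Us"))
# https://www.example.com/about-us/
```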

3. Fix internal 301 redirects

Now that you’ve accumulated as much link equity as possible from external sources, it’s time to ensure that your website is passing it efficiently internally. If your website has a bunch of internal 301 redirects, there’s a chance that your deeper pages may not be receiving as much link equity as they possibly could be. While Google claims there is no link equity lost in 3xx redirects, why leave this up to chance? I would rather be 100 percent sure that internal links are passing their full value throughout the website.

To identify these, I run Screaming Frog in “Spider Mode” on the domain being analyzed. Screaming Frog will crawl the website and gather instances of 301 redirects in the “Redirection (3xx)” report. If you want to determine the order of importance, sort this report by “Inlinks.” You will now see the pages that are internally 301 redirecting the most.

Often, these are instances of internal redirects in key areas such as the primary/secondary navigation, footer or sidebar links. This is great because with one change, you can eliminate a large quantity of these internal 301 redirects. While you’ll want to fix as many as possible, I recommend you start there.
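The fix itself amounts to rewriting each internal link straight to its redirect target so the 301 hop disappears. A minimal sketch, with a hypothetical redirect map:

```python
# Sketch: pointing internal links directly at their 301 targets so navigation,
# footer and sidebar links skip the redirect hop. The map is hypothetical.
redirects = {"/old-services/": "/services/"}

def fix_internal_link(href):
    # Links with no redirect entry are already pointing at the final URL.
    return redirects.get(href, href)

print(fix_internal_link("/old-services/"))  # /services/
```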

Final thoughts

One thing I’ve learned during my time as an SEO is that webmasters are fantastic at diluting equity. Changes such as website migrations and previous URL redirects all have a large impact on link equity.

While in an ideal world link equity would be kept in mind during these implementations, that is often not the case. The above steps should serve as a good starting point to getting some of yours back.
