Pratik Dholakiya – Search Engine Land

How to build authoritative links with data-driven content (June 6, 2018)

Looking for the extra punch your content needs in order to earn links from your outreach and publishing efforts? One important solution lies in a place that may not sound exciting: data.

We all understand the importance of search engine optimization (SEO) and link building in particular. But earning links can be hard unless we understand why people link.

Most people who link to a web page are looking for something to support a claim or back up their narratives; they are looking for data to support their ideas.

Let’s look at three types of data-driven content that will dramatically improve your ability and opportunity to earn authoritative links.

1. Data analysis

Content-based data analysis might sound dry, but the reality is that much of the best-performing, link-attracting content on the web is the result of data analysis.

Content data analysis includes original research, such as correlation studies and other applications of statistical techniques to data. The data itself may be publicly accessible, or it may be obtained through more proprietary means.

As an SEO, you may be familiar with ranking correlation studies conducted by various companies like SEMrush and Ahrefs, which have earned each company many backlinks.

In the case of these ranking correlation studies:

  • They used proprietary tools to crawl the search engine results for a large list of keywords.
  • They crawled various on-page and off-page metrics associated with the pages in the search results.
  • They reported correlations between these metrics and rankings in the search results.

Correlation measures how often two factors go together, on a scale from -1 to 1 (or sometimes -100% to 100%), but it can’t tell you whether one factor causes the other, if they share a common cause, or if “chance” is responsible for the correlation. Even so, correlations are a good place to start if you are looking for possible causal connections to look into.
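
As a rough illustration of how the numbers behind such a study are produced, here is a minimal sketch that computes a Spearman rank correlation between one page-level metric and ranking position. The input file and column names are hypothetical placeholders:

# Hypothetical sketch: correlate a page metric with ranking position.
# Assumes a CSV with one row per search result and columns named
# "position" (1 = top result) and "referring_domains" (placeholder names).
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("serp_crawl.csv")

# Spearman compares ranks rather than raw values, which suits ranking data.
rho, p_value = spearmanr(df["referring_domains"], df["position"])

# A negative rho means more referring domains tend to go with a better
# (numerically lower) position; it still says nothing about causation.
print(f"Spearman rho: {rho:.2f} (p = {p_value:.4f})")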

For that reason and more, people in the SEO community find these kinds of correlation studies very useful.

But analysis of this kind is useful outside of the SEO community as well.

One site that has produced a lot of this type of content is the dating site okcupid.com.

For example, one of their posts was an analysis of how people rated the attractiveness of people on the dating site. There were surprising revelations, such as camera flashes adding seven years to the apparent age of the person in the photo.

As of this writing, this post has links from over 300 domains, suggesting just how powerful data analysis-based content can be, even in the consumer market.

But you don’t necessarily need to have your own proprietary data to create this kind of content. Publicly available data sets can serve the same purpose.

Cross-referencing these data sets against each other to find correlations can be an especially interesting way to identify novel information worth discussing in a data-driven piece of content.

2. Surveys

A highly popular post on Search Engine Land was a survey conducted by BrightLocal entitled “88% Of Consumers Trust Online Reviews As Much As Personal Recommendations.”

This survey was based on answers from 2,104 respondents, and it earned over 140 backlinks.

Surveys like these tend to make their way through the press and pick up a lot of links, provided the topic of the survey is original, relevant enough to be newsworthy and interesting to your core audience.

Here’s how to get started creating a survey like this and putting together the appropriate content to match:

  1. Identify a question that your target audience, or their influencers, would be interested in knowing the answer to. It needs to be a quantitative question, and if it has been covered before, enough time should have passed that the earlier studies are out of date.
  2. Search forums and Q&A sites for candidate questions, with an eye for those that don’t have any satisfactory quantitative answers.
  3. Choose a platform for your survey, such as Google Surveys or SurveyMonkey.
  4. Keep the survey as short as possible or you are likely to get fewer or less accurate responses.
  5. Your questions should not be open-ended, and the answer options should be informative.
  6. After getting your results, identify the most eye-catching piece of information in a quantifiable form, and make that your headline. The content should be built around the headline, since this is what most people will see and share.
  7. Flesh out your content using the results of the survey, but be sure to contextualize the findings with your own expert interpretation.
  8. Reference previous research by yourself and others, and refer to examples or your own experience.

3. Research compilations

A research compilation is simply a post that compiles previous data-focused research and uses it to create a comprehensive overview of a topic.

While an individual research compilation won’t necessarily capture as much press as a piece of original research, compilations can be created more easily and consistently.

Since they collect results from a wide range of sources and address a topic more comprehensively, they are often more evergreen and can continue to pick up links over the long term.

They can also be regularly updated as new information becomes available.

One strong example is a post at HubSpot that has earned almost 2,000 links. The post is a massive roundup of bullet points, each of them sharing a quantitative fact from a previous study. The facts are organized into categories and draw on a huge number of sources.

Massive lists like these aren’t the only way to do research compilations, however.

Consider the Search Engine Land “What Is SEO?” guide. It, too, has picked up a massive number of links and is the second most linked-to piece of content on this site. It introduces people to SEO and is reputably sourced throughout.

While the Search Engine Land guide doesn’t fall within the most strict definition of a “data-driven” post, it is very much a compilation of prior research and an alternative example of how to approach this type of content.

Despite the variety of ways in which a research compilation can be approached, here are some commonalities you’ll find in most successful examples:

  • They are more comprehensive and useful. They one-up any previous content on the same topic by providing all of the value that content provides, plus a little more.
  • They are heavily sourced. The information is not being presented out of thin air, and the sources are authoritative. The content is factually dense, with a limited amount of elaboration, and only enough interpretation and context to maximize practical use in the shortest amount of time possible.
  • They are evergreen. They are designed to be referred back to often, bookmarked and revisited regularly. While they will be useful on their first visit, they contain enough information that it’s not possible to absorb all of the information in one go.
  • They are well structured. Despite citing a large number of references, these posts are still designed to work as a cohesive whole by incorporating clear and easy to navigate categorization, an order in which to absorb the information, or both.
  • They frequently include some data visualization. To avoid becoming a monotonous stream of facts, the content uses data visualization or may even be an infographic.
  • Attention is drawn toward them in the structure of the site itself. They are featured readily in the main navigation, and calls to action may exist elsewhere on the site to point them out.

Conclusion

Data-driven content performs well because hard facts and numbers give your words weight in a way that isn’t otherwise possible.  The concrete nature of data-based arguments and advice captures attention and makes you a source worth citing.

Data analysis allows you to find interesting connections between phenomena that pose meaningful implications for your readers. Surveys keep your audience in touch with the zeitgeist, and research compilations produce excellent, evergreen content that they will refer back to time and time again.

The 40-point SEO checklist for startups (April 18, 2018)

Startups can't afford to miss an SEO trick when it comes to launching a new site. Here's a checklist to help keep you on track.


Whether you are in the process of taking your startup site public or honing your on-site search engine optimization (SEO) post-launch, it’s important to have a process in place to make sure you aren’t missing anything.

To that end, we’ve collected 40 factors we recommend incorporating into your checklists and processes to ensure that your SEO stays ahead of the game.

The following checklist takes into account SEO factors related to your:

  • Server setup.
  • Indexation.
  • Technical content factors.
  • Site architecture.
  • Mobile factors.

Keep this on hand the next time you need to evaluate your site.

Server-side SEO


During the process of developing a website for your startup, you will need to make sure you have your server and hosting issues covered. Here are some considerations to watch out for leading up to and after your launch.

1. Monitor site uptime: Use a free uptime monitoring tool such as Pingdom or UptimeRobot to verify that your site’s uptime is reasonable. In general, you should aim for an uptime of 99.999 percent. Dropping to 99.9 percent is sketchy, and falling to 99 percent is completely unacceptable. Check your web host’s uptime guarantees and how they compensate you when those guarantees are broken, and hold them to their word with monitoring tools.

2. Switch to HTTPS: Set up HTTPS as early as possible in the process. The later you do this, the more difficult the migration will be. Verify that hypertext transfer protocol (HTTP) always redirects to hypertext transfer protocol secure (HTTPS), and that this never leads to a 404 page. Run a secure sockets layer (SSL) test to ensure your setup is secure.

3. Single URL format: In addition to making sure HTTP always redirects to HTTPS, ensure the www or non-www uniform resource locator (URL) version is used exclusively, and that the alternative always redirects. Ensure this is the case for both HTTP and HTTPS and that all links use the proper URL format and do not redirect.

4. Check your IP neighbors: If your internet protocol (IP) neighbors are showing webspam patterns, Google’s spam filters may have a higher sensitivity for your site. Use an IP neighborhood tool (also known as a network neighbor tool) to take a look at a sample of the sites in your neighborhood and look for any signs of spam. We are talking about outright spam here, not low-quality content. It is a good idea to run this tool on a few reputable sites to get an idea of what to expect from a normal site before jumping to any conclusions.

5. Check for malware: Use Google’s free tool to check for malware on your site.

6. Check for DNS issues: Use a DNS check tool such as the one provided by Pingdom or Mxtoolbox to identify any DNS issues that might cause problems. Talk to your webhost about any issues you come across here.

7. Check for server errors: Crawl your site with a tool such as Screaming Frog. You should not find any 301 or 302 redirects, because if you do, it means that you are linking to URLs that redirect. Update any links that redirect. Prioritize removing links to any 404 or 5xx pages, since these pages don’t exist at all, or are broken. Block 403 (forbidden) pages with robots.txt.

8. Check for noindexing and nofollow: Once your site is public, use a crawler to verify that no pages are unintentionally noindexed and that no pages or links are nofollowed at all. The noindex tag tells search engines not to put the page in the search index, which should only be done for duplicate content and content you don’t want to show up in search results. The nofollow tag tells search engines not to pass PageRank from the page, which you should never do to your own content.

9. Eliminate soft 404s: Test a nonexistent URL in a crawler such as Screaming Frog. If the server does not return a 404 status for it, you have a soft 404 problem. Google wants nonexistent URLs to return a real 404; you just shouldn’t link to them from your own site. The sketch below spot-checks this, along with the redirect rules from items 2 and 3.
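
A minimal sketch of that spot check, assuming the requests library and substituting your own domain for the placeholder URLs:

# Spot-check canonical redirects (items 2 and 3) and soft 404s (item 9).
# All URLs below are placeholders.
import requests

CANONICAL = "https://www.example.com/"

# Every scheme/host variant should end up at the canonical homepage via a 301.
for variant in ("http://example.com/", "http://www.example.com/", "https://example.com/"):
    resp = requests.get(variant, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else resp.status_code
    ok = resp.url == CANONICAL and resp.status_code == 200
    print(f"{variant} -> {resp.url} (first hop {first_hop}) {'OK' if ok else 'CHECK'}")

# A URL that does not exist should return a real 404, not a 200 "soft 404".
missing = requests.get(CANONICAL + "this-page-should-not-exist-xyz123", timeout=10)
print("Soft 404 check:", "OK" if missing.status_code == 404 else f"got {missing.status_code}")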

Indexing

Run your site through the following points both before and after your startup goes live to ensure that pages get added to the search index quickly.

1. Sitemaps: Verify that an eXtensible markup language (XML) sitemap is located at example.com/sitemap.xml and that the sitemap has been submitted to Google Search Console and Bing Webmaster Tools. The sitemap should be dynamic and updated whenever a new page is added. The sitemap must use the appropriate URL structure (HTTP versus HTTPS and www versus non-www) and this must be consistent. Verify the sitemap returns only status 200 pages. You don’t want any 404s or 301s here. Use the World Wide Web Consortium (W3C) validator to ensure that the sitemap code validates properly. (A quick validation sketch appears at the end of this section.)

2. Google cache: See Google’s cache of your site using a URL like:

 http://webcache.googleusercontent.com/search?q=cache:[your URL here].

This will show you how Google sees your site. Navigate the cache to see if any important elements are missing from any of your page templates.

3. Indexed pages: Search Google for site:example.com to see if the total number of returned results matches your database. If the number is low, it means some pages are not being indexed, and these should be accounted for. If the number is high, it likely means you have duplicate content issues to resolve. While this number is rarely 100 percent identical, any large discrepancy should be investigated.

4. RSS feeds: While rich site summary (RSS) feeds are no longer widely used by the general public, they are often read by crawlers and can help additional links get picked up, which is useful primarily for indexing. Include a rel=alternate link pointing to your RSS feed in the source code, and verify that your RSS feed functions properly with a reader.

5. Social media posting: Use an automatic social media poster, like Social Media Auto Publish for WordPress, for your blog or any section of your site that is regularly updated, as long as the content in that section is a good fit for social media. Publication to social media leads to exposure, obviously, but also helps with ensuring your pages get indexed in the search results.

6. Rich snippets: If you are using semantic markup, verify that the rich snippets are showing properly and that they are not broken. If either is the case, validate your markup to ensure there are no errors. It is possible that Google simply won’t show the rich snippets anyway, but if they are missing, it is important to verify that errors aren’t responsible.
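
Returning to the sitemap checks in item 1, here is a minimal validation sketch using the requests library and the standard library XML parser; the sitemap URL and expected prefix are placeholders:

# Check that every sitemap URL returns a 200 and uses the canonical scheme/host.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder
EXPECTED_PREFIX = "https://www.example.com/"     # placeholder

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)

for loc in root.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if not url.startswith(EXPECTED_PREFIX):
        print(f"Inconsistent URL format: {url}")
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status} returned for {url}")  # 301s and 404s do not belong in a sitemap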

Content

Put processes in place to ensure that the following issues are handled with each new piece of content you plan to create post-launch, and check each of these points on your site before you launch.

1. Missing titles: Use a crawler to verify that every page on your site has a title tag.

2. Title length: If you are using Screaming Frog, sort your titles by pixel length and identify the length at which your titles are getting cut off in the search results. While it is not always necessary to reduce the title length below this value, it is vital that all the information a user needs to identify the subject of the page shows up before the cutoff point. Note any especially short titles as well, since they should likely be expanded to target more long-tail search queries.

3. Title keywords: Ensure that any primary keywords you are targeting with a piece of content are present in the title tag. Do not repeat keyword variations in the title tag, consider synonyms where they read naturally, and place the most important keywords as close to the beginning as you can without making the title awkward. Remember that keyword use should rarely trump the importance of an appealing title.

4. Meta descriptions: Crawl your site to ensure that you are aware of all missing meta descriptions. It is a misconception that every page needs a meta description, since there are some cases where Google’s automated snippet is actually better, such as for pages targeting long-tail queries. However, the choice between a missing meta description and a present one should always be deliberate. Identify and remove any duplicate meta descriptions. These are always bad. Verify that your meta descriptions are shorter than 160 characters so that they don’t get cut off. Include key phrases naturally in your meta descriptions so that they show up in bold in the snippet. (Note that 160 characters is a guideline only, and that both Bing and Google currently use dynamic, pixel-based upper limits.)

5. H1 headers: Ensure that all pages use a header 1 (H1) tag, that there are no duplicate H1 tags, and that there is only one H1 tag for each page. Your H1 tag should be treated similarly to the title tag, with the exception that it doesn’t have any maximum length (although you shouldn’t abuse the length). It is a misconception that your H1 tag needs to be identical to your title tag, although it should obviously be related. In the case of a blog post, most users will expect the header and title tag to be the same or nearly identical. But in the case of a landing page, users may expect the title tag to be a call to action and the header to be a greeting.

6. H2 and other headers: Crawl your site and check for missing H2 headers. These subheadings aren’t always necessary, but pages without them may be walls of text that are difficult for users to parse. Any page with more than three short paragraphs of text should probably use an H2 tag. Verify that H3, H4, and so on are being used for further subheadings. Primary subheadings should always be H2.

7. Keywords: Does every piece of content have a target keyword? Any content that does not currently have an official keyword assigned to it will need some keyword research applied.

8. Alt text: Non-decorative images should always use alt text to identify the content of the image. Use keywords that identify the image itself, not the rest of the content. Bear in mind that alt text is intended as a genuine alternative to the image, used by visually impaired users and by browsers that cannot render the image, so it should always make sense to a human reader. Alt text is not for decorative images like borders, only for images that serve a purpose as content or interface.
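
Several of the checks above (titles and their length, meta descriptions, a single H1, alt text) can be bundled into a quick audit script for individual pages. A rough sketch with a placeholder URL, assuming the requests and BeautifulSoup libraries; it is a supplement to, not a replacement for, a full crawler:

# Quick on-page audit: title, meta description, H1 count, missing alt text.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
print(f"Title ({len(title)} characters): {title or 'MISSING'}")

meta = soup.find("meta", attrs={"name": "description"})
description = meta.get("content", "").strip() if meta else ""
print(f"Meta description ({len(description)} characters): {'present' if description else 'MISSING'}")

h1_tags = soup.find_all("h1")
print(f"H1 tags: {len(h1_tags)} (should be exactly 1)")

missing_alt = [img.get("src", "") for img in soup.find_all("img") if not img.get("alt")]
print(f"Images without alt text: {len(missing_alt)}")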

Site architecture

It’s always best to get site architecture handled as early on in the launch process as possible, but these are important considerations you need to take into account even if you have already launched.

1. Logo links: Verify that the logo in your top menu links back to the homepage, and that this is the case for every section of your site, including the blog. If the blog is its own mini-brand and the logo links back to the homepage of the blog, ensure that there is a prominent homepage link in the top navigation.

2. Navigational anchor text: Your navigational anchor text should employ words for your target keyword phrases, but should be short enough to work for navigation. Avoid menus with long anchor text, and avoid repetitious phrasing in your anchor text. For example, a dropdown menu should not list “Fuji apples, Golden Delicious apples, Granny Smith apples, Gala apples” and so on. Instead, the top menu category should be “Apples,” and the dropdown should just list the apples by type.

3. External links: Links to other sites in your main navigation, or otherwise listed on every page, can be interpreted as a spam signal by the search engines. While sitewide external links aren’t necessarily a violation of Google’s policies on link schemes, they can resemble the “Low quality directory or bookmark site links,” and Google explicitly calls out “Widely distributed links in the footers or templates of various sites.” It’s also crucial that any sponsored links use a nofollow attribute and a very good idea to nofollow your comment sections and other user-generated content.

4. Orphan pages: Cross reference your crawl data with your database to ensure that there are no orphan pages. An orphan page is a URL that is not reachable from any links on your site. Note that this is different from a 404 page, which simply does not exist but may have links pointing to it. Because orphan pages receive no link equity from your site, they are unlikely to rank. They can also be considered “doorway pages” that may be interpreted as spam. If you do not have access to database information, cross reference crawl data with Google Analytics (a minimal sketch of this comparison appears at the end of this section).

5. Subfolders: URL subfolders should follow a logical hierarchy that matches the navigational hierarchy of the site. Each page should have only one URL, meaning that it should never belong to more than one contradicting category or subcategory. If this is unfeasible for one reason or another, ensure that canonicalization is used to indicate which version should be indexed.

6. Link depth:  Important pages, such as those targeting top keywords, should not be more than two levels deep, and should ideally be reachable directly from the homepage.  You can check for link depth in Screaming Frog with “Crawl depth.” This is the number of clicks away from the page you enter as the start of your crawl.

7. Hierarchy: While pages should be accessible from the homepage within a small number of clicks, this does not mean that your site should have a completely flat architecture. Unless your site is very small, you don’t want to be able to reach every page directly from the homepage. Instead, your main categories should be reachable from the homepage, and each subsequent page should be reachable from those category pages, followed by subcategories, and so on.

8. No JavaScript pagination: Every individual piece of content should have an individual URL. At no point should a user be able to navigate to a page without changing the browser URL. In addition to making indexation very difficult or impossible for search engines, this also makes it impossible for users to link directly to a page they found useful.

9. URL variables: URL variables such as “?sort=ascending” should not be tacked onto the end of URLs that are indexed in the search engines, because they create duplicate content. Pages containing URL variables should always canonicalize to pages without them.

10. Contextual linking: Google has stated that editorial links embedded in the content count for more than links within the navigation, and best practice suggests surrounding a link with descriptive text; your site’s internal links will pass more value if they appear in context. In other words, internal linking within the main body content of the page is important, particularly for blog and editorial content. Even product pages should ideally have recommendation links for similar products.
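
As for the orphan pages in item 4, the cross-referencing itself is just a set comparison. A minimal sketch, assuming you have exported the crawled URLs and your known URLs (from your database or analytics) to plain text files with one URL per line; the file names are placeholders:

# Orphan pages = URLs your database or analytics know about that the crawler never reached.
def load_urls(path):
    with open(path) as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

crawled = load_urls("crawl_urls.txt")  # e.g., exported from Screaming Frog
known = load_urls("known_urls.txt")    # e.g., from your CMS database or Google Analytics

orphans = known - crawled
for url in sorted(orphans):
    print(url)
print(f"{len(orphans)} orphan page(s) found")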

Mobile

Virtually every modern startup should start right off the bat with a mobile-friendly interface and infrastructure. Check for and implement the following as early on as possible.

1. Google Mobile-friendly test: Run the Google Mobile-friendly test to identify any issues that Google specifically finds with how users will experience your site on mobile.

2. Implement responsive design: Your site should be responsive, meaning that it will function properly and look good to users no matter what device they are accessing your site from. If this is outside your wheelhouse, look for a theme labeled “responsive template.” Responsive themes are available for nearly all platforms, and some free options are almost always available. Be sure to eliminate any extraneous visual elements that are unnecessary on a mobile device; use @media rules in your CSS to hide these elements.

3. JavaScript and Flash: Verify that your pages work fine without JavaScript or Flash. Use your crawler or database to identify pages that reference small web format (.swf) and JavaScript (.js) files, and visit these pages using a browser with JavaScript disabled and no Adobe Flash installed. If these pages are not fully functional, they will need to be reworked. Flash in general should be entirely replaced with cascading style sheets (CSS). JavaScript should only be used to dynamically alter HTML elements that are still functional in the absence of JavaScript.

4. Responsive navigation: Verify that your drop-down menus are functional on mobile devices and that the text width doesn’t make them unattractive or difficult to use.

5. Responsive images: Even some responsive themes can lose their responsiveness when large images are introduced. For example, placing the following rule in your stylesheet (or between your <style> tags) will ensure that images size down if the browser window is too small for the image:

img {
max-width: 100%; /* max-width (rather than width) scales large images down without stretching small ones */
height: auto; /* preserve the aspect ratio */
}

6. Responsive videos and embeds: Videos, and especially embeds, can really bungle up responsive themes. For example, if you are using the HTML video tag, placing this rule in your stylesheet (or between your <style> tags) will cause your videos to scale down with the browser window:

video {
max-width: 100%;
height: auto;
}

7. Interstitials and pop-ups: Verify that any pop-ups or interstitials you use are fully compatible with any device, and consider blocking them for devices below a certain pixel width. It should always be possible to close out of an interstitial or pop-up, and button sizes should always be reasonable for a touch interface.

Conclusion

As important as innovation and a personalized strategy are to an effective launch, without a foundation of processes to ensure the basics are taken care of, it becomes easy to chase new SEO trends and neglect what we know works. As you develop your startup launch strategy and follow-up SEO work, refer to this checklist and build your own processes to ensure that this doesn’t happen.

4 concrete ways to use images to build links (March 14, 2018)

Every picture tells a story and may also help you build links. Here are four solid ways to use images to attract links.


“Create visual content and the links will follow” is a nice sentiment, but in reality, creating visual content is a prerequisite for earning links, not a guarantee that they will come.

If you want to use images to earn inbound links, you need a concrete plan with some specific actionable goals.

Here are four ways you can use images and visual content to build links and drive traffic. Use the following tactics to get the ideas and inbound links flowing and build a smart strategy for your brand.

1. Become your industry’s stock photo site

It’s become more or less standard in this industry that, with a few exceptions, every blog post features at least one image, both to keep people engaged and to be taken seriously.

In many cases, those images are stock photos with some thematic connection to the topic of the post, rather than original image content.

There’s nothing inherently wrong with using stock images, but you can take advantage of this demand by becoming a go-to resource in your industry for visual content.

Here are some ideas to help you do that:

  • Make a list of niche keywords in your industry, and perform an image search on Google to see if there is a lack of good images out there.
  • Create images that represent something insightful about those keywords and their related topics. This could be in the form of original journalistic photographs, data visualizations such as infographics or visual metaphors.
  • Create a blog post around your visual content and include an embed code to make it easy for people to reuse the image with credit. Look for an “embed code generator” tool to help create the embed code, or generate it yourself (see the sketch after this list).
  • Create a “stock photo” page on your site that collects all of your original images, along with embed codes. The title of the page should include those phrases bloggers use when searching for images, such as “free stock photos,” “public domain images,” “creative commons images” or similar phrases, as well as the relevant niche terms. Make sure to include image alts and image labels in text for the more specific keywords. Include your embed codes here as well to make sure it’s easy for people to link to you with credit.
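
If you would rather generate the embed code yourself, the snippet is just an image wrapped in a link back to the page that hosts it. A tiny sketch with placeholder URLs and file names:

# Build a copy-and-paste embed code that credits your site (all values are placeholders).
IMAGES = [
    {"url": "https://www.example.com/images/keyword-research-chart.png",
     "page": "https://www.example.com/blog/keyword-research-chart/",
     "alt": "Chart of example keyword research data"},
]

for img in IMAGES:
    embed = (
        f'<a href="{img["page"]}">'
        f'<img src="{img["url"]}" alt="{img["alt"]}" style="max-width:100%;height:auto;"></a>'
        f'<br><a href="{img["page"]}">Image by example.com</a>'
    )
    print(embed)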

Bear in mind that your visual content doesn’t necessarily need to be the most amazing thing ever, as long as it addresses topics that aren’t as readily addressed in other images.

Examples of this are the top image results for Moz. Their Whiteboard Friday images lack visual flair, but they get the point across.

2. Identify image keywords bloggers are likely to search for

This is related to the tactic above, but it’s a topic with enough depth that it deserves its own section.

The goal here isn’t just to identify keywords your consumer audience is searching for, or even keywords that other influencers in your industry are searching for.

You need to specifically identify keywords that bloggers and influencers are using images for and linking to.

Start by scraping a few prominent sites in your niche and looking for patterns. Here is one approach I recommend using:

  • Use Screaming Frog to crawl a top publisher with an audience similar to yours.
  • Go to the “External” tab and select “Images” from the filter.
  • Export the image links and analyze the image alt text for any patterns.

Unfortunately, most publishers these days don’t use external links to display images; instead, they host the image on their own site, with an image credit link. Since these credit links aren’t embedded in the same hypertext markup language (HTML) element as the image itself, there’s no easy way to identify them automatically.

What you can do, however, is crawl the site for their internal images and analyze the image alts they are using for some ideas.

While you won’t be able to immediately tell which images are credited to other sources and which were produced internally, you can quickly determine what topics their visual content tends to focus on.

You can also do a crawl of all external links and export the anchor text.

While this won’t limit the external links to image credits, it will help you identify the kind of topics they are most willing to link out to. Combining that with your image alt data and some manual inspection, you can start to get a clear idea of what kinds of keywords to target with your images.
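
A minimal sketch of that combined analysis, assuming you have exported the image alt text and the external anchor text from Screaming Frog as CSV files; the file and column names are placeholders, so adjust them to match your exports:

# Count recurring words in exported image alt text and external anchor text.
from collections import Counter
import csv
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "for", "in", "on", "with", "your"}

def word_counts(path, column):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            words = re.findall(r"[a-z']+", (row.get(column) or "").lower())
            counts.update(w for w in words if w not in STOPWORDS)
    return counts

alt_counts = word_counts("images_export.csv", "Alt Text")    # placeholder names
anchor_counts = word_counts("external_links.csv", "Anchor")  # placeholder names

print("Most common alt text words:", alt_counts.most_common(20))
print("Most common anchor words:", anchor_counts.most_common(20))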

Repeat this process for several top publishers until you have a clear, extensive list of keywords to target, with your original images.

Now test the viability of your keywords:

  • Test the keyword volume in the Google Keyword Planner. You don’t need a lot of volume, since the keywords you are focusing on should be keywords searched for by bloggers, not general audiences. But you will need to make sure enough people are searching for the topic that bloggers would regularly come across the image.
  • Search for the keyword with Google image search to see what comes up. Image quality is a big factor, but relevance is even more important. What you are really shooting for is a keyword without a good image designed to convey the idea clearly. As long as you go tight enough with your niche, this is more common than you might think.
  • Avoid generic keywords. Generic keywords should be a jumping-off point only. You should be looking for highly specific keywords that convey very clear concepts that can be presented visually.
  • Use a tool such as SEMrush to estimate the difficulty of ranking for the keyword.

3. Reach out to people using your original images

If you are creating original visual content and publishing it to your site, and you have a decent amount of exposure in Google Images, there is a very good chance people are using your images without linking to you.

Capitalize on this by contacting these people and politely asking them to give you credit with a link. (In all but the most egregious monetized cases, I would avoid making copyright threats, especially since it is more likely to result in their removing the image than linking to you for credit.)

To find sites that are using your image, go to Google Images and click the camera icon:

You’ll be asked to paste an image URL or to upload an image:

Now, paste the image URL (pointing to the image itself, not the page it’s on) into the “Paste image URL” tab, or click “Upload an image” and browse through your folders to locate the image if you are storing it locally on your machine. You can also just drag and drop an image into this pop-up.

Then click “Search by image.”

Scroll past the “Best guess for this image” and “Visually Similar Images” results, down to the “Pages that include matching images.” Click through to verify that they are still using the image, find their contact information, and send them an email requesting they cite your image with a link.

If you are producing a lot of image content on a regular basis, this process can get tedious, so it’s better off being automated. In that case, you can use the sites that allow you to do “reverse image search” for a larger number of images on a periodic basis.

4. Perfect your image-to-word ratio

According to a study by BuzzSumo, the blog posts that receive the most shares on Facebook and Twitter are the ones that include roughly one image for every 75 to 100 words.


Since there’s a relatively strong correlation between social sharing and the number of inbound links you earn, getting the right mix of images and words can be a smart link-earning strategy.

As with any statistic, especially one based on observational analysis instead of experimentation, it should be taken with a grain of salt. Rather than considering this “best practice,” use it as a jumping-off point, test a few different ratios over time and measure what seems to work best within your niche.

In most niches, the more hardcore your fan base, the more knowledge-hungry they are, meaning that they will be more willing to read walls of text (although you’d better be leveraging your white space even if that’s the case).

You may also find that your link-earning and social media activity aren’t as heavily correlated in your industry.

Regardless, the point stands. Measuring your image-to-word ratio — and how it correlates with the number of inbound links you earn — will help inform your link-earning strategy and allow you to make more optimized decisions.
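
Measuring the ratio for a published post is straightforward. A rough sketch, assuming the requests and BeautifulSoup libraries and that your post body lives inside an <article> element (adjust the selector to match your template):

# Compute words per image for a published post.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/blog/some-post/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

body = soup.find("article") or soup.body  # assumes the post content uses an <article> tag
words = len(body.get_text(separator=" ").split())
images = len(body.find_all("img"))

ratio = words / images if images else float("inf")
print(f"{words} words, {images} images -> {ratio:.0f} words per image")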

Now, it’s time to put these ideas to use and up your visual SEO game!

SEO trends and Google changes to expect in 2018 (January 11, 2018)

An exploration of current search trends and speculation on where the industry might be headed in 2018.


We’re already over a week into 2018, and the start of a new year is a great time to check in and see where we stand as an industry — and how things might change this year.

Prepare for fake news algorithm updates

Back in 2010, Google was getting beaten up in the media for the increasing amount of “content farm” clutter in the search results. That negative press was so overwhelming that Google felt it had no choice but to respond:

[We] hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content.

Soon after that, in February 2011, the Google Panda update was released, which specifically targeted spammy and low-quality content.

Why do I bring this up today? Because the media has been hammering Google for promoting fake news for the past year and a half — a problem so extensive that search industry expert Danny Sullivan has referred to it as “Google’s biggest-ever search quality crisis.”

Needless to say, these accusations are hurting Google’s image in ways that cut far deeper than content farms. While the problem of rooting out false information is a difficult one, it is one that Google has a great deal of motivation to solve.

Google has already taken action to combat the issue in response to the negative press, including banning publishers who were promoting fake news ads, testing new ways for users to report offensive autocomplete suggestions, adjusting their algorithm to devalue “non-authoritative information” (such as Holocaust denial sites), and adding “fact check” tags to search results.

Of course, the issue of trustworthy search results has been on Google’s radar for years. In 2015, researchers from Google released a paper on Knowledge-Based Trust (KBT), a way of evaluating the quality of web pages based on their factual accuracy rather than the number of inbound links. If implemented, the Knowledge-Based Trust system would ultimately demote sites that repeatedly publish fake news (although there is a potential for it to go wrong if the incorrect facts become widely circulated).

Whether the Knowledge-Based Trust method is enough to combat fake news — or if some version of it has already been implemented without success — is difficult to say. But, it’s clear that Google is interested in making truthfulness a ranking factor, and they’ve never had a stronger motivation to do so than now.

Voice search and featured snippets will grow hand-in-hand

One in five mobile search queries currently comes from voice search — a number that is likely to rise as Google Assistant-enabled devices such as Google Home continue to grow in popularity. And as voice search grows, we can expect to see an increase in featured snippets, from which Google often sources its voice search results.

Indeed, there is already evidence that this growth is taking place. A study released by Stone Temple Consulting last year confirmed that featured snippets are on the rise, appearing for roughly 30 percent of the 1.4 million queries they tested.

If this trend continues, featured snippets may even begin to rival the top organic listing as the place to be if you want to get noticed. (For more on featured snippets and how to target them, check out Stephan Spencer’s excellent primer on the subject.)

Artificial intelligence (AI) will power many more aspects of search

It’s now been over two years since we were first introduced to RankBrain, Google’s machine-learning AI system which helps to process its search results. Since its introduction, it’s gone from handling 15 percent of search queries to all of them.

Google’s interest in AI extends much further than RankBrain, however. They have developed the Cloud Vision API, which is capable of recognizing an enormous number of objects. Indeed, Google has so much machine-learning capacity that they are now selling it as its own product.

But perhaps most interestingly, Google has now built an AI that is better at building AI than humans are. This was a project by Google Brain, a team that specializes specifically in building AI for Google.

Unfortunately, AI is not without its issues. AIs tend to get stuck in local minima, where they arrive at a “good enough” solution and are unable to climb out of it in order to discover a better solution. They also have a tendency to confuse correlation with causation; one might even call them “superstitious” in that they draw connections between unrelated things. And since the developers only program the machine-learning algorithm, they themselves don’t understand how the final algorithm works, and as a result, have even more difficulty predicting how it will behave than in the case of traditional programs.

As Google continues to embrace AI and incorporate more of it into their search algorithms, we can expect search results to start behaving in less predictable ways. This will not always be a good thing, but it is something we should be prepared for.

AI doesn’t change much in the way of long-term SEO strategies. Optimizing for AI is essentially optimizing for humans, since the goal of a machine-learning algorithm is to make predictions similar to those of humans.

Manipulative guest posting is likely to take a hit

In May, Google warned webmasters that using article marketing as a large-scale link-building tactic is against its guidelines and could result in a penalty. Since this is already well known in the SEO community, Google’s announcement likely signals that an algorithm update targeting manipulative guest posting is on the horizon.

What counts as manipulative guest posting? To me, the most vital piece of information from Google’s guidelines has always been the recommendation to ask yourself, “Does this help my users? Would I do this if search engines didn’t exist?”

Guest posts that don’t expand brand awareness or send referral traffic aren’t worth doing, except for the possibility that they will positively impact your search engine rankings. The irony of taking that approach is that it isn’t likely to work well for your search engine rankings either — at least not in the long term.

I’m not saying anything that isn’t common knowledge in the SEO community, but I have a feeling that a lot of people in this industry are fooling themselves. All too often, I see marketers pursuing unsustainable guest posting practices and telling themselves that what they are doing is legitimate. That is what a lot of people were telling themselves about article marketing on sites like EzineArticles back in the day, too.

‘Linkless’ mentions

Bing has confirmed that they track unlinked brand mentions and use them as a ranking signal — and a patent by Google (along with observations from many SEO experts) indicates that Google may be doing this as well.

As AI begins to play a bigger part in rankings, it’s not unreasonable to expect “linkless” mentions of this type to start playing a bigger role in search rankings.

The tactics used to earn brand mentions are, of course, not much different from the tactics used to earn links, but since the number of people who mention brands is much higher than the number of people who link to them, this could provide a good boost for smaller brands that fall below the threshold of earning press.

This highlights the importance of being involved in conversations on the web, and the importance of inciting those conversations yourself.

An interstitial crackdown may be on the way

The early 2017 mobile interstitial penalty update was a sign of Google’s continued battle against intrusive mobile ads. The hardest hit sites had aggressive advertising that blocked users from taking action, deceptive advertising placement and/or other issues that hindered use of the interface.

However, columnist and SEO expert Glenn Gabe noted that the impact of this penalty seemed… underwhelming. Big brands still seem to be getting away with interstitial ads, but Google may decide to crack down on these in the near future. The crucial factor seems to be the amount of trust big brands have accumulated in other ways. How all of this shakes out ultimately depends on how Google will reward branding vs. intrusive advertising.

Mobile-first indexing

It’s been nearly three years since Google announced that mobile searches had finally surpassed desktop searches on its search engine — and just last year, BrightEdge found that 57 percent of traffic among its clients came from mobile devices.

Google is responding to this shift in user behavior with mobile-first indexing, which means “Google will create and rank its search listings based on the mobile version of content, even for listings that are shown to desktop users.” Representatives from Google have stated that we can expect the mobile-first index to launch this year.

In other words, 2018 very well may be the year where signals that used to only impact searches from mobile devices become signals that impact all searches. Sites that fail to work on a mobile device may soon become obsolete.

Be prepared for this year

Google has come a long way since it first hit the scene in the late 1990s. The prevalence of AI, the political climate, and Google’s warnings against manipulative guest posts and intrusive advertising all signal that change is coming. Focus on long-term SEO strategies that will keep you competitive in the year ahead.

How independent reviews influence Google’s trust in your brand (December 20, 2017)

Cultivating user reviews is an integral part of any search strategy, especially for local businesses. Here’s a look at the impact of reviews and where to focus your efforts.


Search Engine Land columnist Kevin Lee recently wrote a post about the prevalence of fake reviews, how they are damaging consumer trust and why it’s a bad move with permanent repercussions to attempt to use them yourself.

The reason for this growing problem is that online reviews have tremendous influence over the purchasing decisions of consumers, as well as the performance of brands in the search engines. Luckily, many major review sites — including Google, Amazon and Yelp — are taking steps to combat the issue.

With all of this in mind, now is a good time to address how to approach online reviews in an ethical way that will produce long-lasting, positive results for brand perception and search engine traffic.

Google associates trust with ratings and reviews

It’s important to establish the relationship between user reviews and SEO performance before moving forward. Understanding that relationship will inform how to best approach and build a strategy for earning reviews.

A recent study affirmed the strong correlation between ratings and search engine performance for local businesses. The study was conducted by Local SEO Guide in cooperation with the University of California, Irvine, and Places Scout. It analyzed the correlation between over 200 potential ranking factors and the rankings of over 100,000 local businesses.

Specifically, the study found that reviews that mention a keyword you are targeting, or that mention your location, are associated with better rankings in the search results.

Do reviews enhance your performance in general search results, outside of local search?

That is a bit more contentious. Google itself has stated that star ratings in AdWords enhance click-through by up to 17 percent, and a study by BrightLocal has found that organic listings with 4- and 5-star ratings (in the form of rich snippets) enjoy a slightly higher click-through rate than listings with no stars. While there’s never been a formal confirmation, there is a great deal of evidence to suggest that higher click-through rates (CTR) may indirectly enhance your rankings in the search results.

Even if reviews don’t directly impact search rankings, the fact that they enhance click-through rates may potentially affect your rankings in an indirect fashion. And increased CTR is a benefit in itself!

User-generated content and reviews also heavily influence consumer decisions. A study by TurnTo found that user-generated content influenced buyers’ decisions more than any other factor looked at in the study, including search engines themselves.

The fastest way to success

Google has made it easy for you to get your customers to review you, and this is the very first thing you should start with.

Find your PlaceID using the Place ID lookup tool that Google provides. Put your business name in the “Enter a location” search bar. Click on your business name when it appears, and your PlaceID will pop up underneath your business name.

Copy the PlaceID and paste it to the end of this URL: https://search.google.com/local/writereview?placeid=

For example, a Macy’s location with the PlaceID ChIJ3xjWra5ZwokRrwJ0KZ4yKNs would have the following review URL:

https://search.google.com/local/writereview?placeid=ChIJ3xjWra5ZwokRrwJ0KZ4yKNs

Now, try that URL in your browser with your business’s PlaceID to test whether it works or not. It should take you to a search result for your business with a “Rate and review” pop-up window.

Share this URL with your customers after transactions to pick up reviews on your Google My Business account.
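
If you manage more than one location, the same pattern can be generated in bulk. A tiny sketch with made-up location names and placeholder PlaceIDs:

# Build "write a review" links from Google PlaceIDs (the IDs below are placeholders).
REVIEW_BASE = "https://search.google.com/local/writereview?placeid="

place_ids = {
    "Downtown store": "ChIJxxxxxxxxxxxxxxxxxxxxxxx",
    "Airport store": "ChIJyyyyyyyyyyyyyyyyyyyyyyy",
}

for name, place_id in place_ids.items():
    print(f"{name}: {REVIEW_BASE}{place_id}")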

While the Google My Business reviews are likely to have the largest impact on search engine rankings, they are not the only reviews Google takes into consideration, and it is in your best interest to pick up reviews from third-party sites as well. Third-party review sites can help you pick up more reviews more quickly, and they add diversity to your review profile, which enhances your legitimacy. This, in turn, imbues the reviews with greater authority.

In addition to boosting the authority and diversity of your reviews, third-party review sites help in a few other ways. Many are designed to make it simple to request reviews from your customers in an organized way. (Though be advised that some, like Yelp, discourage review solicitation.)

6 more tactics for picking up reviews

If you want to take things further, listed below are a few more tactics for you to consider working into your review strategy:

  1. Identify any industry-specific review sites: Reviews from industry-specific sites (think Avvo for lawyers or ZocDoc for doctors) can be huge, especially if you know that your potential customers are using these sites. It’s important to identify which vertical review sites may be relevant to you and to devise a strategy for earning positive reviews on these sites.
  2. Seek reviews from product bloggers: While blogger reviews are an entirely different ballgame from user reviews, they are equally important. Links from trusted bloggers are a strong signal that can positively affect your search engine rankings, and if the bloggers have audiences who trust the reviewer’s opinion, their reviews can earn you referral traffic with conversion rates not achievable from most sources. Just be sure that the blogger discloses any financial arrangement you might have with them.
  3. Respond to your reviewers: So long as you handle it tactfully, responding to reviewers (including and perhaps especially negative ones) can have a tremendously positive impact on brand perception, as it shows that you care about your customers. The important thing to remember about responding to reviews is that your response is not only for the customer but also for anybody else who sees the interaction. How you treat that review is how they will expect to be treated.
  4. Contact your happiest customers: It goes without saying that the happiest customers are the ones most likely to leave a positive review. Tactfully encouraging these customers to leave reviews is an important move if you want people to perceive you in a positive light. (Just be sure that you understand each site’s review solicitation guidelines.)
  5. Use social media for customer support: While social media shouldn’t replace a customer support team, many consumers see social media as a place to solve any problem they are having with their product. Many also use social media as a place to complain, often without even trying to contact your business. Be prepared for this, and respond to any mentions of your brand on social media with an offer to help. Don’t make the mistake of asking them to talk to you and take the conversation offline. Keep it online and portray yourself in the best way possible.
  6. Ask the right questions: Whatever media you are using to encourage your customers to leave a review, it’s important to make sure you are asking the right questions. Asking them simply to let people know if they liked the product typically isn’t the way to go, since it leads to very generic reviews. Ask more specific, pointed questions about how the product helped them solve a particular problem. These are the kind of stories that encourage people to purchase a product.

Conclusion

Online reviews play an incredibly important part in a buyer’s journey, from interest to purchase. They have a heavy influence on rankings in local search results and play an important part in more traditional search engine performance as well.

Brick-and-mortar businesses should use thank-you emails and other customer communications to point consumers to their Google My Business pages. Take advantage of third-party review sites to easily encourage reviews. Reach out to your customers and online influencers to improve coverage of your products.

Do not neglect these efforts. User reviews influence modern purchasers heavily. If your product is strong, your efforts will pay dividends.

A site migration SEO checklist: Don’t lose traffic (November 22, 2017)

Planning a site migration? This guide will ensure that you cover all the SEO bases to make the transition as smooth as possible.


Few things can destroy a brand’s performance in the search results faster than a poorly implemented site migration.

Changing your domain name or implementing HTTPS can be a great business move, but if you fail to consider how search engines will react to this move, you are almost certain to take a major hit in organic search traffic.

Use the following SEO checklist to prepare yourself as you develop a migration game plan for your website.

1. Carefully consider if migration is the right choice

A site migration will almost always result in a temporary loss of traffic — Google needs time to process the change and update its index accordingly. A carefully executed site migration can minimize traffic fluctuations, and in a best-case scenario, Google will ultimately treat the new site as if it were the original.

Still, that is only the best-case scenario. The reality is that site migrations, in and of themselves, typically offer little to no SEO benefit and do not eliminate search engine penalties. (That is why SEOs often use site migrations as an opportunity to make SEO improvements, like streamlining the site structure, fixing broken links, consolidating redundant pages and making content improvements.)

With all of that in mind, when is a site migration worth it?

  • When a strong rebranding is in order.
  • When migration will generate press and links.
  • When the site needs to be moved to HTTPS (one of the few cases in which migration alone offers an SEO gain).

2. Use a sandbox

Never do a site migration without first testing everything on a test server. Verify that the redirects work properly, and do all of the checks that follow in private before going public. Trying to do it all in one go without testing is bound to lead to errors, and if the mistakes are bad enough, they can set your site back by weeks.

3. Plan to migrate during a slow period

A well-planned and monitored migration shouldn’t permanently affect your traffic, but you should plan for a temporary dip. For that reason, it’s best to perform the migration during a slow part of the year, assuming that there is some seasonality to your site’s performance. A site migration during or shortly before the holidays is always a bad idea. While the goal should always be to avoid losing any traffic, it’s important to make sure that if you do lose traffic, you lose it when business is already slow.

4. Crawl your site before the migration

Crawl your site with a tool like Screaming Frog, and be sure to save the crawl for later.

You need to make sure you have a complete list of the URLs on your old site so that nothing ends up getting lost because of the transition.

Use this as an opportunity to identify any crawl errors and redirects that exist on the old site. These have a tendency to accumulate over time; I rarely come across a site that doesn’t have at least some broken or redirected links.

You should absolutely remove or replace any links that point to 404 pages during the migration process. In addition, I highly recommend updating any links that point to redirected pages so that they point to the final page. You do not want to end up with redirect chains after the migration.

Remember that a site crawl may not be able to identify every single page on your site. For example, if you have pages that aren’t linked from other pages on your site, they won’t show up in a crawl. You can use your own records and databases to find these pages, of course, but if this isn’t possible, you can find these pages in your Google Analytics data, as well as through a link explorer like Ahrefs.

If you find any orphan pages, update the site during the migration so that these pages are linked to from other pages. Orphan pages are much less likely to pick up search engine traffic if they aren’t linked to from the rest of your site.

5. Benchmark your analytics

Make a copy of your Google Analytics data; you will need this information so that you can quickly identify if any traffic is lost after the migration.

If any traffic is lost, export the Analytics data from your new site and run a side-by-side comparison with the data from your old site, so that you can identify precisely which pages lost the traffic. In many cases, a loss of traffic will be isolated to individual pages, rather than taking place across the entire site.

You may also want to identify and take note of your top linked-to pages using a tool like Ahrefs. After the migration, you will want to pay special attention to these pages and monitor them closely. If these lose traffic, it is a sign that the authority isn’t being properly transferred from your old site to the new one. These pages contribute the most to your authority, so losses here may affect the overall performance of your site.

6. Map all changed URLs from old to new

You should have a spreadsheet that lists every old URL and every new URL.

Ideally, during a site migration, all of the old pages exist on the new site. Obviously, removing a page removes its ability to capture search engine traffic. On top of that, dropping too many pages during the migration may lead Google to conclude that the new site isn’t the same as the old site, causing you to lose your rankings.

Also, ideally, the URL architecture should be identical to the old one unless you have very strong reasons to change it. If you do plan on changing it, a site migration may seem like the ideal time to do it, but you should be aware that doing so may cause Google to see it as an entirely different site. If you do both at the same time, you will not be able to determine whether any losses in traffic were the result of changing the architecture or of migrating the site.

Another reason to keep the architecture the same is that it allows you to use regex in your .htaccess file to easily redirect from your old pages to the new ones. This puts less load on your server than naming the redirects one by one, and it makes the process of setting up the redirects much less painful.
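To make this concrete, here is a minimal sketch of what such a rule might look like on an Apache server. The domain names are placeholders, and it assumes the folder structure stays the same, so treat it as a starting point to adapt and test in your sandbox rather than a drop-in implementation:

    RewriteEngine On
    # Send every request for the old domain to the same path on the new domain
    RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]

Because the rule carries the matched path over via $1, a single directive covers every page, which is exactly why keeping the URL architecture identical makes the migration so much easier.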

7. Update all internal links

The HTML links on your new site should point to the new site, not the old one.

This might sound obvious, but as you go through the process, you will quickly realize how tempting it might be to leave the links unchanged, since they will redirect to the new URL anyway. Do not succumb to this temptation. Apart from the server load, which slows down site performance, the redirects may dampen your PageRank.

The ideal way to rewrite the links is by performing a search and replace operation on your database. The operation should be performed so that it updates the domain name without changing the folder structure (assuming you’re keeping your site structure the same).

Write your search and replace operations carefully so that only text containing a URL is updated. You generally want to avoid updating your brand name and your URLs with the same search and replace operation.

8. Self-canonicalize all new pages

Verify that canonicalization on the new site references the new site and not the old. Canonicalizing to the old site can be disastrous, as it may prevent the new site from being indexed.

I recommend self-canonicalizing all of your pages on the new site (except, of course, for pages that should canonicalize to another page). In combination with the redirects, this tells Google that the new site is, in fact, the new location of the old site. Sitewide self-canonicalization is recommended anyway, since URL parameters create duplicate content that should always canonicalize to the parameter-free URL.
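For illustration, a self-referencing canonical tag in the head of a page on the new site would look something like the line below (the URL is a placeholder):

    <link rel="canonical" href="https://www.new-example.com/category/product-page/" />

The href should always be the clean, parameter-free version of the page’s own URL on the new domain.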

9. Resolve duplicate content issues

Various missteps during the migration process can result in duplicate content issues. Be aware of these issues, and take steps to avoid them:

  • If multiple versions of a URL are published (for example, with and without a trailing slash), the result is duplicate content. Proper self-canonicalization should take care of the issue, but I always recommend setting up redirect rules in .htaccess so that only one version of the page is accessible. Keep internal links consistent so they don’t point through redirects.
  • Your server’s raw IP address should redirect to the canonical domain rather than serving the site directly.
  • Look out for folders that lead to the same content, especially “default” folders.
  • Verify that only HTTPS or HTTP is used and that only the www or non-www version of the site is accessible. The others should redirect to the proper site.
  • If your site has a search function, the search result pages should be noindexed.
  • I mentioned this earlier, but self-canonicalization should be in place to avoid duplicate content created by URL query strings.

10. Identify and address any removed pages

I mentioned above that you should generally avoid removing any pages during the migration. If some pages simply must be removed for branding purposes, take the following steps:

  • Make a list of all the pages.
  • Do not redirect the old pages to the new site.
  • Remove all links from these pages.
  • Remove the pages from the old site and let their URLs return a 404.
  • If there is a suitable replacement for the page, set up a redirect and change all of the links to point to the new page. You should only do this if the replacement page serves the same purpose as the old page.
  • Do not redirect the removed pages to the home page (this creates what is called a “soft 404”). If there is no suitable replacement for a page, let it 404. A 404 only becomes a problem if you are still linking to the missing page.

11. Ensure that a custom 404 page is in place

A custom 404 page allows users to easily navigate your site and find something useful if they land on a page that no longer exists.

12. Manage and submit sitemaps

Keep your old sitemap in Google Search Console, and add the sitemap for the new site as well. Asking Google to recrawl the old sitemap, so that it discovers the redirects, is a good way to accelerate the process.

13. Keep analytics in place at all times

Install Google Analytics on the new domain and get it up and running well before you launch the site to the public. You do not want to have any missing data during the transition, and it’s important to watch for any changes in traffic during the migration.

14. Redirect all changed links

As mentioned above, the ideal way to set up your redirects is with a regular expression in the .htaccess file of your old site. The expression should simply swap out your domain name, or swap HTTP for HTTPS if you are doing an SSL migration.
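As a rough sketch, the HTTPS version of that rule on an Apache server could look like this; the exact directives vary by host, so verify it on your test server first:

    RewriteEngine On
    # Redirect any request that arrives over plain HTTP to the HTTPS version of the same URL
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]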

For any pages where this isn’t possible, you will need to set up an individual redirect. Make sure this doesn’t create any conflicts with your regex and that it doesn’t produce any redirect chains.

Test your redirects on a test server and verify that this doesn’t produce any 404 errors. I recommend doing this before the redirects go live on your public site.

Keep in mind that once the redirects go live, your site has effectively been migrated. The new site should be in pristine condition before setting up the redirects.

15. Keep control of the old domain

Unless the purpose of the migration was to sell the original domain, I would strongly advise against giving up control of the old domain. Ideally, the old domain should redirect to the new one, on a page-by-page basis, indefinitely. If those redirects are lost, all of the inbound links earned by the old site will also be lost.

Some industry professionals claim that you can give up control of the old domain once Google stops indexing it, but I would never advise doing this. While it’s possible that Google will attribute links pointed at the old site to the new one, even without the redirect, that is placing far more faith in the search engine than I would ever recommend.

16. Monitor traffic, performance and rankings

Keep a close eye on your search and referral traffic, checking it daily for at least a week after the migration. If there are any shifts in traffic, dive down to the page level and compare traffic on the old site to traffic on the new site to identify which pages have lost traffic. Those pages, in particular, should be inspected for crawl errors and linking issues. You may want to pursue getting any external links pointing at the old version of the page changed to the new one, if possible.

It is equally important to keep a close eye on your most linked pages, both by authority and by external link count. These pages play the biggest role in your site’s overall ability to rank, so changes in performance here are indicative of your site’s overall performance.

Use a tool like SEMrush to monitor your rankings for your target keywords. In some cases, this will tell you if something is up before a change in traffic is noticeable. This will also help you identify how quickly Google is indexing the new site and whether it is dropping the old site from the index.

17. Mark dates in Google Analytics

Use Google Analytics annotations to mark critical dates during the migration. This will help you to identify the cause of any issues you may come across during the process.

18. Ensure Google Search Console is properly set up

You will need to set up a new property in Google Search Console for the new domain. Verify that it is set up for the proper version, accounting for HTTP vs. HTTPS and www vs. non-www. Submit both the old and new sitemaps to solidify the message that the old site has been redirected to the new one.

Submit a change of address in the Google Search Console, request Google to crawl the new sitemap, and use “fetch as Google” to submit your new site to be indexed. It is incredibly important to verify that all of your redirects, canonicalizations and links are error-free before doing this.

19. Properly manage PPC

Update your PPC campaigns so that they point to the correct site. If your PPC campaigns are pointing to the old site, attribution will be lost in Analytics because of the redirect.

20. Update all other platforms

Update all of your social media profiles, bios you use as a guest publisher, other websites you own, forum signatures you use, and any other platforms you take advantage of, so that the links point to the new site and not the old.

21. Reach out for your most prominent links

Contact the most authoritative sites that link to you in order to let them know about the migration, and suggest that they update the link to point to the new website. Not all of them will do this, but those that do will help accelerate the process of Google recognizing that a site migration has occurred.

I wouldn’t recommend doing this with every single link, since this would be extremely time-consuming for most sites, but it is worth doing this for your top links.

22. Monitor your indexed page count

Google will not index all of the pages on your new site immediately, but if the indexed page count is not up to the same value as the old site after a month has passed, something has definitely gone wrong.

23. Check for 404s and redirects

Crawl the new site to verify that there are no 404s or 301s (or any other 3xx, 4xx, or 5xx codes). All of the links on the new site should point directly to a functioning page. The 404 and 5xx errors are the biggest offenders and should be taken care of first. If there is a suitable replacement for a 404 page, change the link itself to point to the replacement, and verify that a 301 is in place for anybody who arrives at the missing page through other means.

The second-worst offenders are links to 301 pages that exist on the old site. Even though these redirect to the new site, the server load is bad for performance, and linking back to the old site may lead to confusion over the fact that a site migration has taken place. While all of the other efforts taken should clarify this to Google and the other search engines, these things are best never left to chance.

Any other 301s can be taken care of after this. Always update your internal links to point directly to the correct page, never through a redirect.

24. Crawl your old URLs

Use Screaming Frog or a similar tool to crawl all of your old URLs. Be sure to crawl the list of URLs you collected before the migration, and make sure the list includes any URLs that were not discoverable by crawling. Do not attempt to crawl the old site directly in spider mode; the home page will 301 to the new site, so the crawler will follow the redirect and stop after the first page.

Verify that all of the old URLs redirect to the new site. There should not be any 404s unless you removed the page during the migration process. If there are any 404s, verify that there are no links to them. If the 404s are not intended, set up a proper redirect.

Check the external URLs in the crawl report (the redirect targets on the new domain) to verify that all of the redirects resolve properly. None of these destination URLs should return a 301 or a 404. A 301 at the destination indicates a redirect chain and is bad for performance. A redirect to a 404 will lead to a very frustrating experience for your users and may hurt your SEO in other ways.

Conclusion

If a site migration is carried out without taking SEO into account, you can almost bet on losing search engine traffic in the process. Other than clients who have approached me after being penalized by Google, the worst SEO predicaments I’ve come across were the ones caused during a site migration by professionals who didn’t consider how search engines would react to the process. Keep all of the above in mind if you are planning to migrate your site, and it should go off without a hitch.

The post A site migration SEO checklist: Don’t lose traffic appeared first on Search Engine Land.

]]>
Nofollow links are not useless: Earning them Is central to good SEO /nofollow-links-not-useless-earning-central-good-seo-285412 Wed, 25 Oct 2017 14:50:04 +0000 /?p=285412 Some SEOs discount nofollow links, but contributor Pratik Dholakiya argues such an approach could result in many missed opportunities.

The post Nofollow links are not useless: Earning them Is central to good SEO appeared first on Search Engine Land.

]]>

With major publishers like Inc., Forbes and The Huffington Post placing the rel=”nofollow” tag on their external links, the sky is once again falling. Or not. In fact, I’ve always believed that earning nofollow links was an important part of any SEO strategy built to last.

The reality of the situation is that nofollow links are good for your SEO, full stop. Whether your evidence comes from case studies, personal experience or correlative data, the answer is the same.

We can debate about whether nofollow links have any direct impact on rankings until the cows come home, but in the end, it just doesn’t matter that much. What matters is that, if you are earning nofollow links on high-profile platforms, you are earning brand exposure, referral traffic and various off-site signals that do help your rankings in the search results.

I have witnessed the effect myself far too often to conclude otherwise, and anybody who has been in this industry long enough knows that you shouldn’t decide to pursue — or decline to pursue — a link based upon whether or not a link is nofollowed.

Let me present the evidence, and then I’ll explain how you can make the most of link building by incorporating nofollow links in the appropriate way.

Nofollow links can definitely help SEO: The evidence

I feel comfortable saying that nofollow links definitely help your SEO, although most of the benefits are probably indirect. The exposure associated with a high-profile nofollow link is well worth the effort and contributes positively to your visibility in search results, as well as sending direct referral traffic and improving brand reach. It also appears to be almost indisputable that nofollow links help pages get indexed.

It’s more speculative to say that nofollow links can, in some cases, directly improve your rankings, and I won’t commit to a statement that strong. What I can say is that search engines reserve the right to ignore the nofollow tag, and I suspect that they do for some links they view as editorially placed and trustworthy.

Remember that Google’s own answer is that “In general, we don’t follow them.” (Emphasis mine.) This seems to imply that, while they usually don’t follow them, they sometimes do.

Perhaps more importantly, if your link-building strategy places importance on whether or not a link is nofollowed, then you are using the wrong link-building strategy. Google guidelines have been clear on this for a very long time. If you’re doing something just for the SEO value, it’s probably a violation of the Google guidelines.

Your link-building strategies should be focused on building exposure that leads to organic SEO signals. That is where the real value is.

But let’s not talk about platitudes. Let’s talk about evidence.

Case studies

Consider this case study by TekNicks. Between January of 2014 and May of 2015, they helped a client earn 99 links. Of those, only 11 were followed. The remaining 88 links were all nofollowed links — 89 percent of the total.

But during that period, the client saw 288 percent growth in their organic search traffic. At the end of the period, the client ranked in the top position for their main keyword, which TekNicks claims is “very competitive” and which receives 2,000 monthly searches.

At the end of this period, they additionally ranked for an even more competitive keyword, with 8,100 monthly searches. For the period, organic traffic grew from 1,700 sessions a month to 6,500 sessions.

But, perhaps equally importantly, one of the nofollow links they earned sent 3,922 referrals between January and October of 2014.

And TekNicks isn’t the only agency to experience something like this. Fractl has three excellent examples of nofollow links working wonders for clients, demonstrating the power of media exposure.

They developed an infographic called “Your Face as an Alcoholic” for client Rehabs.com, which quickly hit the front page after they shared it with the Daily Mail in 2014. The resulting exposure led to coverage in 900 media stories, including The Huffington Post and the New York Daily News.

Only 30 percent of those newly-earned links were dofollow, and they earned over 14,000 shares on social media.

In a second example, Fractl placed a story for a client on Yahoo Travel, exposing how expensive hotels often have more germs than cheaper hotels. This featured article led to coverage in 700 stories, a third of which contained dofollow links, as well as 23,000 social shares.

Finally, one Fractl client saw a 271 percent increase in organic search traffic resulting from an exclusive, but nofollowed, link on BuzzFeed.

In a more controlled test, Eli Schwartz of SurveyMonkey demonstrated that, at a minimum, nofollow links definitely help pages get indexed.

After SurveyMonkey moved its blog from a subdomain to its root domain, Eli ran a few tests on the old subdomain URLs. He modified the 404 page to include a link to a test page with bogus anchor text. Google crawled the 404 page and indexed the test page in under 48 hours, after the link was included in a newly published item. The resulting link even carried the anchor text.

Running the same test again with a link to a different page, he tried using a nofollow link instead. As you can probably guess, Google indexed the URL, even though the hyperlink was nofollowed. He did notice, however, that the anchor text didn’t carry over.

How Google treats nofollowed links

Nofollowed links are also typically accompanied by brand mentions. According to a Google patent, brand mentions may be considered “implied links.” In other words, if a brand gets mentioned online, that mention may be treated in a similar manner to an actual link. While we don’t know for sure, a nofollow link placed alongside a brand mention may also help the search engines connect the mention to the website it refers to, since a bare mention is far less explicit than a link.

Whether or not “co-citation” of this form helps traditional search results, it’s certainly clear that citations help local search. In one example, local SEO Phil Frost explains how including citations (with name, address and phone number) in a press release helped a client move from position 20 to position 1 in local search results for their primary keyword. In this case, despite the links being nofollowed, the citations clearly helped the client rank.

Case studies by Search Engine Land and Moz, in addition to more recent case studies that come out on a fairly routine basis, demonstrate that it is still possible to improve rankings using press release distribution. While we generally avoid this tactic unless it’s also used with the primary goal of generating press, it continues to be popular even though the majority of press release distribution sites now contain nofollowed links. That press releases still help with SEO is a testament to the value of nofollow links in this context, whether direct or indirect.

Correlative analysis of observational ranking data conducted by Ahrefs also suggests that a relatively even split between dofollow and nofollow links may help rankings. While correlative studies have their flaws, primarily because they can’t establish a cause and effect relationship, it would be a mistake to ignore them.

Likewise, Moz’s analysis of ranking factors finds a 0.32 correlation between the number of nofollow links pointing to a page and rankings. This is nearly identical to the correlation between the number of external domains linking to a page and its rankings, which sits just 0.02 higher, at 0.34.

One can rightfully argue that these correlation studies could just be showing us that successful pages are more likely to get linked to, and thus are more likely to receive nofollowed links. This is a reasonable objection, but it applies equally to followed links, and, with such a small difference in correlations, it does make one wonder if nofollowed links could actually contribute directly to rankings.

Regardless of whether or not this is the case, the case studies above demonstrate definitively that, directly or indirectly, nofollow links can have a dramatic positive impact on search engine rankings. My personal experience with nofollow links leads me to the same conclusions.

How to maximize the SEO value of nofollow links

1. Remember: The anchor text is meaningless for keyword rankings

Whether or not nofollow links can directly improve your rankings, the anchor text of a nofollowed link is almost certainly ignored entirely.

If you are earning nofollow links with SEO in mind, anchor text should be the last thing on your mind, or more accurately, you shouldn’t be thinking much about keywords when it comes to anchor text.

The primary value of the link is in getting people to visit your site directly, and that means the purpose of the anchor text is to get people to click through and see more. That means the anchor text should pique the reader’s curiosity as much as possible, promise them something in a clear and non-deceptive way or address objections the user might have to clicking the link.

2. Focus on an audience of influencers

Other than receiving direct clicks from your target audience, the main thing you want a nofollow link to accomplish is to earn additional followed links from trusted influencers.

Earning those links means producing content that appeals to journalists, thought leaders, microcelebrities and others who have large audiences of their own.

This means that your content should be going the extra mile, since influencers are generally the most voracious infovores in your industry. They know almost everything, and they aren’t easy to surprise.

How do you catch these people’s attention with your content? There are two primary methods:

  • Focus on novelty.
  • Focus on being comprehensive.

These can be subdivided into far more categories, but these are the primary things to focus on.

Focusing on novelty means providing influencers with things they’ve never seen before. The best examples of this type of content include:

  • original research, such as surveys, experiments, or studies
  • interactive tools like web apps
  • “investigative journalism”-style work that provides insider information
  • exclusive interviews
  • news
  • event coverage
  • proprietary information

Focusing on being comprehensive includes things like:

  • ultimate guides
  • white papers
  • how-to videos
  • e-books
  • courses
  • “30-Day Challenges”
  • introductions and primers
  • glossaries and dictionaries

In short, say something new, or distill something big.

If you do this, and then get your resource published on a major platform, it doesn’t matter whether your link is followed or nofollowed. What matters is how the exposure leads to coverage in the press, on social networks, on blogs, in magazines and so on.

By making influencers your audience, you maximize your reach and SEO impact.

3. Use the opportunity to mention your brand

As I mentioned above, Google patents suggest that a simple mention of your brand can help improve your visibility in the search results. Such brand mentions may be treated as “implied links” and, if so, likely carry similar authority metrics, so that a mention in a more authoritative media platform results in a stronger rankings boost.

Whether Google has actually put this patent to use and found that it helped their rankings algorithm is unclear, but brand mentions are valuable for obvious reasons, and can indirectly benefit your SEO as well.

Brand mentions lead to increased searches for your brand name, which in turn can help your rankings in a virtuous feedback cycle.

While you shouldn’t name-drop shamelessly, don’t skip the chance to promote your brand when you place a nofollow link on an authoritative platform.

4. Leverage social media

Failure to pursue nofollow links can hurt your SEO performance in many ways, but one of the worst consequences is the tendency to avoid techniques that involve (typically nofollowed) social media.

Google has explicitly stated many times that there are no special ranking factors developed for social networks.

Since Google evidently doesn’t use “likes” and “retweets” as ranking factors, and since links on social platforms are nofollowed, some in the SEO industry ask, “Why bother?”

Well, for starters, as I mentioned above, Google’s own statements on nofollow suggest that they sometimes do count nofollowed links, even though in general they don’t. Moz’s correlation studies certainly find very strong correlations between social media activity and rankings. Could the nofollowed links from this social activity be counting toward rankings?

There’s no way to know for sure, but the correlation is meaningful either way.

What social media undoubtedly can do is earn you attention that leads indirectly to links. Viral activity on social networks inevitably leads to media coverage and followed links. Scrapers also replicate links from social media in other locations, often without the nofollow tag.

Social media platforms are perhaps the most effective way to amplify your content’s reach in the short term. In addition to sharing your content with your own audience, you can leverage other influencers by reaching out to personalities that are popular on social media. If you do so tactfully, you can reach much larger audiences. This activity inevitably leads to naturally earned links, as well as various other off-page activity that helps improve your rankings.

5. Republishing

Republishing your content on major platforms is a tactic that frequently results in nofollowed links, but if the platforms attract a large enough audience, this is well worth the effort. Since many bloggers and editors refer to major media platforms for their sources, if you can get republished on a major platform, you can earn editorial links from the writers who cite those platforms frequently.

While it’s true that some of these writers will cite the republished version, more vigilant writers will click the nofollowed link and cite your website as the original source, since links to primary sources are preferred by writers who take research seriously.

More speculatively, it’s possible that under some circumstances, Google will see the duplicate content and identify your original publication as the primary source, and as a result, transfer the search engine authority from the other duplicates to your original. I have witnessed effects that seem to imply this is happening, although it would be hasty to conclude with too much certainty that this is exactly what is going on.

Either way, it’s as clear as day that republishing content on more popular platforms expands your reach, puts your brand in front of more eyeballs and increases your likelihood of getting cited with a link by other writers.

6. Get obsessed with referral traffic

Too many in this industry are focused on building links without concerning themselves with whether or not those links actually send any referral traffic.

It’s been said many times but it can never be said enough: the most valuable links are the ones you earn organically and editorially — when people link to you without you reaching out or doing anything else to earn the link.

I’m not arguing that those are the only kinds of links you should be earning, but if you optimize your own “manual” link-building efforts in such a way that it generates the largest number of organic links, you are approaching link building the way you should be.

Few things more reliably produce organic links than sheer traffic. It’s probable that a certain percentage of your readership will always end up linking to you if you have enough readers. So if you can expand the number of people who see your content, you can expand the number of people who will link to you.

Oh, and referral traffic is valuable on its own, too. But you knew that already, right?

So, how do you go about earning nofollow links that send traffic?

I would argue that the primary thing to focus on is earning links that grow your traffic in a cumulative fashion. In other words, it’s not the link that sends you a thousand visits one day and zero the next that you really want to chase. It’s the link that sends 100 visits a day every day for the foreseeable future that you really want to get your hands on.

Here are some of the types of links that can help you accomplish that:

  • Quora. If you’ve ever answered a question on Quora, you’ve probably noticed that while the referral traffic numbers aren’t necessarily high for any given question, you tend to see traffic from Quora for a very long time after posting an answer. Build up a lot of these and you will start seeing cumulative growth in referral traffic.
  • Pinterest. While its traffic-driving power isn’t quite as strong as it was when it first made a splash, it is still an incredibly useful referral source that sends a lot of traffic when an image does really well.
  • Forums. I know they seem like a throwback from the 1990s, but forums are still incredibly popular, and if you use them in a similar fashion to Quora, they can send you long-term cumulative traffic, especially if the forum allows you to link to your site in your forum signature or elsewhere.
  • SlideShare. Presentations here can attract a very different type of audience and can be a constant source of traffic, especially if you are in the B2B sector.
  • Interviews. Interview an influencer, and they are likely to promote that interview on their own platforms. If they publish it on their site, the link can sometimes become evergreen and send a continuous drip of traffic.
  • Resource lists. These are especially popular on educational sites. Inclusion in somebody’s resource list is almost guaranteed to be an evergreen traffic source if their site has enough traffic.
  • YouTube. A YouTube video that does well with the algorithm becomes an evergreen source of brand mentions and traffic.
  • Podcasts. These can be a great source of long-term traffic for the same reasons as YouTube videos.

If you stop chasing the followed link and shift your obsession toward upping your referral traffic, you start to realize how unimportant the nofollow tag really is, both in terms of growing your overall traffic, and even in improving your authority with the search engines.

The myth that nofollow links are useless for SEO needs to die. A solid SEO strategy is not concerned with whether the manual links you place will directly impact your SEO. A thorough reading of the Google guidelines should, in fact, lead you to the conservative assumption that no link you place yourself counts toward rankings. The indirect effects are where the true value lies, and it is where you should be focusing the majority of your effort.

The post Nofollow links are not useless: Earning them Is central to good SEO appeared first on Search Engine Land.

]]>
7 on-site SEO problems that hold back e-commerce sites /7-site-seo-problems-hold-back-e-commerce-sites-283299 Thu, 28 Sep 2017 14:23:35 +0000 http:/?p=283299 Is your e-commerce site experiencing weak organic traffic? Columnist Pratik Dholakiya shares some common issues that impact SEO for online retailers and offers a few suggestions.

The post 7 on-site SEO problems that hold back e-commerce sites appeared first on Search Engine Land.

]]>

Not long ago, I talked about 16 very specific on-site SEO mistakes that I see very often, and how to fix those issues.

Today, I want to shift the focus toward problems that plague e-commerce sites specifically. I’ll also be addressing on-site problems that have a bit more to do with strategy and a bit less to do with specific technical mistakes.

Finally, I wanted to make sure we had some real-world examples to refer to, so I mined case studies from the industry to demonstrate the concrete impact these changes can have on your search traffic.

Let’s take a look at these problems and what you can do to resolve them.

1. Weak product descriptions (or none at all)

Since e-commerce sites usually have a very large number of products, it’s common for product descriptions to be short, automated and provided by the manufacturer.

Unfortunately, this creates a few problems for SEO:

  • Short descriptions give the search engines very little content to work with, and this is a problem. After analyzing 1 million search results, Backlinko concluded that longer content generally performs better: The average Google first page result is 1,890 words long.
  • Automated descriptions that swap a few words into a template can create duplicate content issues.
  • Descriptions provided by the manufacturer are almost certainly replicated on other sites, meaning that you are not providing anything unique for the search engines to index. This means the search engines have no reason to rank you above competitors.

It’s not always possible to manually update the product descriptions for every page on your site, but this action isn’t strictly necessary to resolve these issues. A focus on turning just a few of the highest-value product pages into full-fledged landing pages with conversion-based copy can have a dramatic effect on the rest of the site.

An Australian retailer named Toy Universe was able to increase its search engine traffic by 116 percent in just four months. That doubling in traffic also doubled sales. While many changes were involved in that boost, a large chunk of the effort went toward work on the product descriptions.

Put simply, the site did not originally feature any unique product descriptions or unique content for the category pages. Adding them in was a huge piece of the puzzle.

The Motor Bookstore serves as another classic example.

When Google first released the Panda update, this online automotive bookseller saw a 38.5 percent drop in organic search traffic overnight. The brand was well respected by its customers, but their product descriptions were supplied by the publishers; as a result, those descriptions were identical to the descriptions found on many other sites.

That duplicate content didn’t look good to Google — hence the drop in traffic after Panda was introduced. (These days, Panda is baked into Google’s core ranking algorithm, so your site could be affected by it without your knowing.)

Eliminating duplicate product descriptions and replacing them with unique descriptions can help resolve this issue. Opening up your site to user-generated content like reviews can also help by introducing new content to reduce the proportion that is duplicate — a strategy that has obviously worked wonders for Amazon.

On that note…

2. Not including user reviews

In addition to diluting duplicate content, user reviews seem to affect search results in other ways. It’s not entirely clear whether the presence of reviews affects search engine results directly or indirectly, but the impact is clear and unambiguous.

Yotpo conducted an analysis of over 30,000 businesses that added user reviews to their site and measured how this impacted organic search traffic. The results were stark: Over a period of nine months following review implementation, they found that Google organic page views per month grew by over 30 percent.

Including user reviews can be scary, as this allows buyers to leave negative feedback on your products. But there is a wealth of evidence that user reviews increase conversion rates. In fact, in a bizarre twist of fate, a mix of product ratings converts better than uniformly five-star reviews.

If you’ve been hesitating to include user reviews because of concerns about negative feedback or due to the difficulties of implementation, I highly recommend you take the plunge now. The impact is almost sure to be positive.

3. No unbranded keyword optimization

Perhaps one of the most common issues is that many e-commerce product pages are simply not developed with keywords in mind.

The typical product page is built around a brand and model name. It’s certainly true that some consumers may be searching for these names, and they should definitely be included in the title tag and the other important locations on the page.

That said, most consumers are likely not searching by brand or model name, especially when it comes to more obscure brands.

For that reason, it’s important to include more generic, popular phrases on your pages as well.

This isn’t to say that you should abandon any more niche keyword usage. What I mean is that you should be going after phrases that consumers are using when they search for products like yours, and that means going deeper than branding to focus on the actual mechanics of the consumer journey.

White Hat Holsters did just that, and the result was a 100 percent increase in sales and a 400 percent increase in search engine traffic. The traffic grew from 2,000 to 8,000 visits per week in just eight months.

To accomplish this, they:

  1. used keyword tools and competitive research to identify phrases that consumers were actually using to find products like theirs.
  2. analyzed the meta descriptions, image alts, URLs and headings.
  3. chose three to five closely related keywords to target for each page and updated the above-the-fold region of the pages to semantically reflect those keywords.
  4. created a blog to capture keywords searched for by consumers who were further up the funnel.

4. Focusing too heavily on transactional keywords and not developing informational content

It’s incredibly difficult to rank for “money” keywords, and it’s usually a failing strategy to focus too heavily on them, especially if this means you are neglecting the informational keywords that target customers who are a bit further up the funnel.

By shifting some of your attention toward less transactional keywords and toward more informational ones, you can rank for less competitive keywords and build a stronger reputation with the search engines.

Ranking for these less competitive phrases doesn’t just add traffic for those individual phrases; it can also improve your site’s overall reputation with the search engines. This may be because it influences behavioral metrics. Whatever the cause, I’ve personally witnessed this effect many times.

Darren Demates helped a medical e-commerce site skyrocket its search traffic by an incredible 1,780 percent using an interesting keyword method he calls the “double jeopardy technique.” Here was his process:

  1. He obtained keywords by adding the page URLs into the Google keyword planner, instead of the usual method of guessing keywords and looking at the related suggestions. He also put the focus on informational keywords instead of transactional keywords.
  2. He used SEMrush to find the keyword difficulty for the keywords recommended by the Google keyword planner, then weighed the difficulty against the potential traffic in order to make a judgment call about which keywords to focus most heavily on.
  3. He used a “site:example.com [keyword]” search to identify which page on the site already had the most ranking potential for that keyword.
  4. He searched forums using an “inurl:forum [keyword]” search to find the types of questions people were asking about his informational keywords.
  5. He updated the thin blog posts on his site to answer all of the questions he could find on forums that people had about the keywords.

I’d recommend taking notes here and putting this to use. The possibility of increasing search engine traffic by an entire order of magnitude isn’t the kind of thing you want to ignore.

5. Implementing poorly planned site redesigns

This one hurts to watch.

I’ve had clients rush ahead with a site design without notifying me, and I’ve had new clients who approached me after a site update tanked their rankings.

This experience is incredibly painful, because a site redesign intended to modernize and beautify a site, or to implement changes expected to maximize conversions, can end up obliterating your organic search traffic. Few things hurt more than dropping a wad of cash on something and having it backfire on you.

If you implement a site redesign without taking SEO into account, this is almost bound to happen. Pages that ranked well can get lost, content that was pulling in traffic can get rearranged, and the gains from past wins can be wiped out.

Seer Interactive assisted one retail client who had redesigned their site in order to secure the site with HTTPS. Their redesign caused their organic traffic to plummet by a staggering 75 percent. The situation was so bad that they no longer ranked for their own brand name.

What happened?

  • The redesign deleted several key pages that had been pulling in traffic. This didn’t just lose the traffic those pages had been earning; it also created 404 errors wherever other pages on the site still linked to the missing pages. This can cause PageRank to drop like a brick.
  • The site had a new URL structure, meaning all of the links pointing to the old pages were now pointing to nothing, and all of the authority the site had built up in the past was tossed right out the window.
  • The redesign introduced copies of pages, producing duplicate content that may have caused the site to be algorithmically penalized.

After fixing those issues and introducing a long-term content strategy, the site experienced a 435 percent growth in search traffic. This led to a 150 percent increase in transactions, and a 64 percent increase in revenue. This was accomplished in six months.

Do not execute a site redesign without the help of an SEO professional. The results can be absolutely horrifying.

6. Poor migration between e-commerce platforms

It’s a safe bet that most e-commerce sites are built using third-party platforms. This is a mutually beneficial arrangement that allows the e-commerce business to focus more on its core business and less on web development.

It’s not uncommon for a site to outgrow one platform and switch to another as their market share increases, or to switch platforms in order to gain access to previously unavailable features. Unfortunately, switching e-commerce platforms can sometimes hurt rankings.

In one case, TotalHomeSupply.com found itself losing 37 percent of its traffic after switching from Volusion to Mozu. Despite Mozu being owned by the same company and serving as the enterprise-level version of the same platform, the transfer led to technical SEO issues. (This isn’t a knock against Mozu; this can happen with any e-commerce platform if you aren’t careful.)

The drop in traffic led to a 21 percent drop in transactions, despite a 24 percent boost in the conversion rate that Mozu may have contributed to.

The issue with Mozu was that pagination was handled by JavaScript instead of plain HTML. Inflow worked together with Mozu to eliminate the JavaScript issues, allowing Google to properly crawl the pagination, which had been invisible to the search engines while it was rendered only in JavaScript.

In addition, they trimmed thin content that had led to a demotion from Google’s Panda update and introduced new, high-quality content.

The result was a doubling in year-over-year organic revenue and a restoration of organic traffic to levels higher than before the site migration.

As with site redesigns, make sure an SEO professional is involved any time you update your e-commerce platform.

7. Not optimizing for your most promising keywords

In the section on informational keywords, we mentioned Darren Demates’ “double jeopardy” technique. In addition to its focus on informational keywords, part of the strategy’s success lies in the fact that it leverages keywords that already show promise.

We mentioned that his technique involves performing a site: search to identify which page on the site already ranks best for any given keyword.

A related method of identifying keywords is to analyze your existing rankings to see which keywords are already performing well, and to make changes in order to better optimize for those keywords.

This is what Digital Current did for Sportsman’s Warehouse.

They identified “low-hanging fruit” pages which were already ranking fairly well for keywords, then updated those pages by tweaking the titles, headers and content in order to better reflect the keywords. They were careful to focus on keywords which would be “in season” shortly after the changes were made.

In addition, they performed some link building and improved the quality of the on-site content.

The changes resulted in a 31 percent year-over-year increase in organic search traffic and a 25 percent year-over-year increase in organic search revenue. That worked out to roughly a threefold return on what they had paid for the SEO work.

There are two primary methods you can use in order to optimize for promising keywords. The first is to run your URLs through the Google Keyword Planner as in the “double jeopardy” technique.

The second is to look at your keyword rankings in the Google Search Console or a tool like SEMrush to identify keywords that are already ranking on the second page or so. If these keywords are ranking without having already been optimized, they are a golden opportunity, and you should capitalize on them by updating your titles, headings and content.

This second approach is sometimes called the “low-hanging fruit” technique.

In the process, it’s important to make sure you aren’t cannibalizing your rankings for more important keywords, and, of course, to verify that the changes in the content will be useful for users and will not detract from the primary message of the existing page.

Time to put this information to use!

Don’t close that browser tab just yet. Leave it open and start taking a look at your site. Take a look at the problems I’ve listed here, and ask yourself if you’re facing any of them right now. You’ll thank me when your search traffic starts climbing.

The post 7 on-site SEO problems that hold back e-commerce sites appeared first on Search Engine Land.

]]>
16 common on-site SEO mistakes you might be making right now /16-common-site-seo-mistakes-might-making-right-now-278702 Wed, 02 Aug 2017 16:25:46 +0000 http:/?p=278702 Columnist Pratik Dholakiya shares the 16 technical SEO issues he sees most frequently. Even the most experienced SEO professionals can sometimes overlook these common issues!

The post 16 common on-site SEO mistakes you might be making right now appeared first on Search Engine Land.

]]>

Editor’s note: It has been called to our attention that a number of points in this article are outdated or incorrect. We plan to update the article shortly.

SEO is more than inbound marketing. There’s massive overlap, but there’s a technical side to SEO that sometimes gets neglected, especially by casual followers of the industry.

As somebody who spends a great deal of time looking at sites searching for opportunities to optimize, I notice patterns that creep up often: technical mistakes that show up again and again.

Let’s go over these mistakes. If my experience is anything to go by, odds are high you’re making at least one of them.

1. Nofollowing your own URLs

There comes a time in every SEO’s life when they need to keep a page hidden from the search results — to prevent duplicate content issues, to hide member areas, to keep thin content pages out of the index, to hide archives and internal search result pages, during an A/B test and so on. This is perfectly innocent, perfectly noble and perfectly necessary. However…

… do not use the “nofollow” tag to accomplish this!

The “nofollow” tag doesn’t prevent pages from being indexed by the search engines, but it does ruin the flow of PageRank through your site.

For the very same reason, you should not attempt to sculpt the flow of PageRank through your site by using the “nofollow” tag. Let me explain.

The “nofollow” tag does prevent PageRank from passing through a link, but Google still takes into account the total number of links on your page when determining how much PageRank to pass. In other words, your followed links will pass the same amount of PageRank regardless of whether the other links on the page are nofollowed or not.

I still see this happening often: SEO newcomers and webmasters using “nofollow” tags on their own content, either thinking that it will prevent a page from showing up in the search results, or thinking that they can use it to direct PageRank to their most important pages. The “nofollow” tag accomplishes neither of these things.

When you use a “nofollow” tag, you are throwing away PageRank. Don’t do it, not even on pages that you don’t want indexed. If you want to keep a page out of the index, use this in your HTML head:
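    <meta name="robots" content="noindex, follow" />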

The above directive prevents the page from turning up in the search results but recommends that the search engine follow the links on the page. That way, any PageRank that flows into the unindexed page will be passed back to your site through the links on the page, rather than getting dumped.

2. Not using canonicalization

The rel=canonical tag in the HTML head looks like this:
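    <link rel="canonical" href="https://www.example.com/preferred-page/" />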

It tells search engines that the linked URL, rather than the current page, should be treated as “canon.”

Why would you use this tag? The purpose of it is to prevent duplicate content from getting indexed, which can result in diluting your search engine authority. Using the canonical tag also seems to pass PageRank from the non-canonical page to the canonical page, so there is no need to be concerned about losing the PageRank accumulated by the non-canonical page.

This is a place where conversion optimizers can often fail. Page alternates in an A/B test should make use of the canonical tag so that the alternate page doesn’t get indexed (and so that any authority picked up by the alternate page is passed to the primary page).

Variations on product pages, such as alternates with a different color, are another common example. Duplicates can also get created any time URL query strings are in use. For this reason, sitewide canonicalization can be a good solution for sites that make use of query strings. Self-referencing canonical pages are not generally thought to be an issue.

3. Poor use of outbound links

If you’re linking to another site in your site-wide navigation, and it’s not one of your social media profiles, odds are you should remove the link.

From a pure PageRank standpoint, external links dilute the authority that gets passed back to your own site. This isn’t to say that you shouldn’t be linking to anybody else (which would utterly defeat the purpose of using links as a ranking factor). But outbound links in your own site navigation compound the losses by affecting every page.

Of course, Google has come a long way since the original PageRank algorithm, but there’s another reason why external links in the navigation are iffy: It’s easy for them to look like spam.

The situation is, of course, far worse if the links use keyword anchor text or if the links are placed somewhere where they could be confused for internal site navigation.

Outbound links in the primary content are generally not an issue, but it is important to screen them for quality. Links to “bad neighborhoods” can get a site penalized by Google’s spam team or pushed down the rankings by anti-spam algorithms.

And, of course, it is absolutely crucial that you always nofollow advertisement links of any kind.
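In practice, that just means adding the rel attribute to the ad’s anchor tag. A quick illustration, with a placeholder URL:

    <a href="https://advertiser-example.com/offer" rel="nofollow">Check out our partner’s offer</a>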

4. Not enough outbound links

The idea that “a little bit of knowledge is a dangerous thing” definitely applies here. A limited understanding of how the search engines work leads some to believe that they should never link to another site. While it’s true that the pure PageRank algorithm would suggest this, it’s simply not how things work out in the field.

A case study by Reboot Online makes a pretty clear case for this. They created 10 sites featuring a nonsense keyword, five featuring authoritative outbound links and five not.

The results were about as definitive as possible for a study of this size: All five of the sites with outbound links performed better than the sites without them.

In a post on PageRank sculpting, Matt Cutts, Google’s former head of web spam, also mentions that “parts of our system encourage links to good sites,” which seems to confirm the idea that linking to other sites is important.

To be fair, John Mueller has openly stated that outbound links aren’t “specifically a ranking factor,” while adding that they “can bring value to your content and that in turn can be relevant for us in search.” In the context of the Reboot Online study and Matt Cutts’s statement, this might be interpreted to mean that including citations boosts confidence in content, rather than meaning that outbound links have no effect at all.

Regardless, well-sourced content is a must if you want to be taken seriously — which may have a positive, if indirect, effect on rankings.

5. Poor internal link structure

There’s more than one right way to structure your links, but there are plenty of wrong ways to do it, too.

Let’s start with the basics. As the Google guidelines state:

Build your site with a logical link structure. Every page should be reachable from at least one static text link.

Your typical modern content management system will usually handle at least this much automatically, but this functionality sometimes gets broken. One dangerous myth is that you are supposed to canonicalize the later pages of a paginated post back to the first page. In reality, you should either leave well enough alone or canonicalize to a single page that contains the entire post. This goes for archives and similar pages, too: canonicalizing them away risks erasing the links they contain from the search index.

A completely flat link architecture is another common issue. Some take the idea that every page needs to be accessible through links a bit too far, including links to virtually every page on the site within the navigation.

From the user perspective, this creates obvious issues by making it very difficult to locate appropriate pages.

But this confusion passes on to the search engines and the way that they interpret your site. Without a clear hierarchy, search engines have a very difficult time parsing which pages on your site are most important, which pages cover which topics, and so on.

Remember, there’s much more to the algorithm than PageRank. A categorical hierarchy helps search engines understand your site semantically, which is very important for rankings.

Watch out for tag clouds and long lists of dated archives. These show up less often in modern CMS themes, but they occur often enough that you should know they are to be avoided. Click-throughs on these are awful, and the extra links divide up PageRank. Dated archive lists, in particular, add no semantic information to your link architecture, and category links are much more organized than muddy tag clouds.

Finally, while omitting them isn’t exactly a mistake, we highly recommend linking to your own content from within your body copy. Contextual links within body content are generally believed to count for more than links in the navigation, and they certainly add important semantic value.

6. Poor URL architecture

URL architecture can be a difficult thing to fix without breaking other aspects of your SEO, so we don’t recommend rushing into this, or you might do more harm than good.

That said, one of the most frequent issues I come across is a lack of solid URL architecture. In particular, folder organization is often spotty.

A few common issues:

  • Blog posts listed in multiple categories, and therefore in multiple folders, which creates duplicate content issues.
  • URLs with no folders other than the parent domain. While this is precisely the form your most important pages should take, pages further down the hierarchy should be listed in folders to categorize them.
  • URLs with folders that are, themselves, 404 pages. If a URL is listed under a folder, many users expect that folder to be an operational page. It’s semantically confusing from an architecture perspective, and from an internal linking perspective, it’s ideal for deeper pages to receive links from a live parent-folder page.
  • Junk URLs full of numbers and letters. These days, these are primarily reserved for search result pages and database queries that aren’t intended to be indexed and found in search engines. Your URLs should contain useful information intelligible to a human if you want them to contribute positively to your performance in the search engines.

In addressing these issues, there are two complications you want to avoid: creating 404 pages and losing existing link authority. When you change your URL architecture, you need to make sure that the old pages 301 to the new ones. Ideally, any internal links to the old pages should also be updated, since PageRank is reduced by the damping factor every time it passes through a link or 301.

As an exception, if blog posts are listed in multiple categories, a 301 isn’t always necessary, but in its place you should canonicalize to the preferable page.
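
One way to keep a migration honest is to script the verification. The sketch below assumes you keep a simple redirects.csv mapping old URLs to new ones (no header row); both the file name and its layout are illustrative assumptions.

```python
# Minimal sketch: confirm each old URL returns a single 301 straight to its new
# URL. Assumes a redirects.csv with two columns, old URL then new URL, and no
# header row; the file name and layout are illustrative assumptions.
import csv
import requests

with open("redirects.csv", newline="") as f:
    for old, new in csv.reader(f):
        resp = requests.get(old, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if resp.status_code != 301 or location != new:
            print(f"Check {old}: status {resp.status_code}, Location {location}")
```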

7. Using frames

Frames and iframes are needed in a few places, but you should never use them for anything that you want to be indexed. Google is pretty clear on this:

Frames can cause problems for search engines because they don’t correspond to the conceptual model of the web. In this model, one page displays only one URL. Pages that use frames or iframes display several URLs (one for each frame) within a single page. Google tries to associate framed content with the page containing the frames, but we don’t guarantee that we will.

This isn’t to say that your site should never use them. YouTube embeds make use of iframes, for example.

What you absolutely should not do is use frames as a method of navigating content on your site. This not only makes the content difficult to index, it ruins your site architecture and makes it very difficult for people to reference your content with links.

8. Using unindexable formats

Search engines have limited ability to crawl and index the content found inside images, flash files, Java applets and videos.

As with frames, this isn’t to say that you should never use these formats for anything on your site. What it does mean is that you should never trust the search engines to properly index the content in these formats, and you should always provide alternate content for both users and search engines to access.

9. Not using transcripts

Failing to include transcripts or captions for videos is likely the most common failure associated with unindexable formats. Transcripts and captions allow search engines (and YouTube) to understand videos in a way that isn’t otherwise possible.

A Liveclicker study of 37 web pages found a 16 percent increase in revenue after transcripts were added, and Digital Discovery Networks found that their captioned videos earned 7.32 percent more views on average.

If a transcript would take up too much space on your page, a scroll box is likely the best solution. Alternatives that include the content in the HTML but hide it from the user are likely to be considered cloaking and should be avoided for this reason.

10. Using image alt attributes incorrectly

As mentioned above, you should avoid using images in place of text, since it is difficult for search engines to interpret an image and very unlikely that they will read it the same way they would text.

One thing most webmasters these days are well aware of is the image alt attribute, often referred to as the “alt tag.” The alt tag is meant to provide a text alternative for an image if that image cannot be displayed. In other words, if the user is relying on a screen reader due to a visual impairment, or if their device is incapable of loading the image, they will be presented with the text of the alt attribute instead.

The problem is, a very large portion of webmasters are using it incorrectly. What I mean is that they are treating the alt tag as if it were a keyword tag of some kind, but that is not what it is intended for. All too often, I run into sites that stuff keywords into their image alt tags that have little or nothing to do with the image itself. Even when the keywords are relevant, they often don’t provide the information somebody would need if they can’t see or load the image.

That said, in general it’s considered good practice to keep the image alt below 125 characters. If the image is a large graph or infographic that would require a larger alt to explain, the text should be included elsewhere.
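
As a rough illustration, a short script can surface the worst offenders: images with no alt attribute, an empty one, or one well past that length. The URL is a hypothetical placeholder.

```python
# Minimal sketch: flag images on a page with missing, empty or overlong alt
# attributes. The URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

def audit_alt_text(page_url, max_len=125):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    for img in soup.find_all("img"):
        alt = img.get("alt")
        src = img.get("src", "(no src)")
        if alt is None:
            print(f"Missing alt attribute: {src}")
        elif not alt.strip():
            print(f"Empty alt attribute: {src}")
        elif len(alt) > max_len:
            print(f"Alt text longer than {max_len} characters: {src}")

audit_alt_text("https://www.example.com/some-page/")
```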

11. Unintentional cloaking

Google takes a strong stance against cloaking, but not every incident of cloaking is intentional. While the odds that unintentional cloaking will get you penalized are relatively low, it’s a good idea to avoid cloaking entirely to be on the safe side.

How does unintentional cloaking happen?

A classic example of cloaking is placing text on the site with a color that matches the background. This makes the text invisible to readers, while the search engines can still crawl it. In the past, spammers used to include keywords in hidden text like this, hoping that it would improve their visibility in search results. This hasn’t worked in a very long time, but some spammers do occasionally still try to use this “tactic.”

Unfortunately, this can also happen by accident, when certain elements of your style sheet are accidentally rendered the same color as the background. This should be avoided.

Another frequent accident to watch out for is empty anchor text links: href links with no anchor text. Too many of these may also be considered cloaking.

12. ‘Sneaky’ redirects

A “sneaky” redirect is any redirect that effectively cloaks the search engines from seeing the same page as the user, or sends the user to a page they weren’t expecting to visit. Google has an explicit stance against this as well.

I strongly recommend against using any method of redirecting users other than a 301 redirect, with the rare exception of 302 redirects if they really are intended to be temporary. Using any other method of redirection runs the risk of working for users but not for search engines, resulting in unintentional cloaking.

Related to this, avoid redirect chains, and don’t redirect to an unrelated page. Both of these practices are unfortunately quite common.

Redirecting to an unrelated page is something that is often done because some webmasters think that no URL that has previously existed should ever go 404. This actually isn’t true; Google prefers that you leave a page 404 as opposed to redirecting to an unrelated page, like the home page, for example. Redirects are intended to move users to identical pages, or pages that serve the same purpose, as the original page. Redirects to unrelated pages are considered “soft 404s” at best and sneaky redirects at worst.

Redirect chains throw away PageRank due to Google’s damping factor, and they may also be considered sneaky redirects if they appear to be misleading users or search engines, intentionally or otherwise.
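
A quick way to spot chains is to request your URLs and count the hops the HTTP client records on the way to the final destination. This is only a sketch, and the example URL is hypothetical.

```python
# Minimal sketch: follow each URL and count the redirect hops recorded on the
# way to the final destination. The example URL is a hypothetical placeholder.
import requests

def flag_redirect_chains(urls, max_hops=1):
    for url in urls:
        resp = requests.get(url, timeout=10)
        if len(resp.history) > max_hops:
            hops = [r.url for r in resp.history] + [resp.url]
            print(f"{len(resp.history)} hops: " + " -> ".join(hops))

flag_redirect_chains(["https://www.example.com/old-post/"])
```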

13. Missing or duplicate meta descriptions

It amazes me how often I still come across sites that don’t seem to have heard about meta descriptions. This is one of very few places where search engines give you almost complete control, so don’t waste that opportunity. There’s not much to say here that can’t be found elsewhere, so I’ll leave it at that. I just can’t skip over this one, because I still see it very frequently.

A less obvious issue is the duplicate meta description. I usually see this happen because a template includes a meta description, resulting in entire sections of the site with the same description.

Often this is done intentionally, because developers have heard that every page should have a meta description, and this is their solution. Unfortunately, this actually does more harm than good.

Meta descriptions take the place of Google’s automated search snippet, and while Google’s automated snippet isn’t always optimal, it is bound to be better than a generic snippet designed for a swath of pages.

Then, there are the meta descriptions you shouldn’t have!

Yes, it’s a thing.

This is admittedly a bit of a controversial position, but I am of the opinion that not every page needs a meta description, and there are cases in which using a meta description can be counterproductive.

Consider the case of blog posts designed for long-tail queries. Google’s automated snippets grab bits of content related to the phrase the user actually searched for. In some cases, this means Google’s automated snippet can actually be the better option.

In the case of a blog post designed for long-tail, there’s no way to include every possible phrase a user might have searched for in the meta description. Adding a meta description in this case can lead to a situation where the user doesn’t see any reference to their search query in the snippet, and that may discourage them from clicking through as a result.

How can you determine when it’s a good idea to include the meta description and when not to?

This mostly comes down to keyword strategy. For highly focused pages with a very clear topic, a custom meta description is likely the best choice. For posts more along the lines of a “rant,” or in cases where the content covers a very large number of topics, it’s worth considering the possibility that meta descriptions could actually discourage click-throughs.

Less controversially, it’s important to avoid short meta descriptions, as well as meta descriptions too long for search engines. A good target to shoot for is 130 to 150 characters, in most cases.
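
If you want to audit this at scale, a short script can flag pages with missing descriptions, descriptions shared across multiple URLs, and descriptions outside that length range. The URLs below are placeholders, and the length thresholds simply mirror the targets above.

```python
# Minimal sketch: report missing, duplicate and badly sized meta descriptions
# across a list of URLs. The URLs are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

def audit_meta_descriptions(urls, min_len=130, max_len=150):
    seen = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        desc = (tag.get("content") or "").strip() if tag else ""
        if not desc:
            print(f"Missing description: {url}")
            continue
        if not min_len <= len(desc) <= max_len:
            print(f"Description is {len(desc)} chars: {url}")
        seen[desc].append(url)
    for desc, pages in seen.items():
        if len(pages) > 1:
            print("Duplicate description shared by: " + ", ".join(pages))

audit_meta_descriptions([
    "https://www.example.com/page-one/",
    "https://www.example.com/page-two/",
])
```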

14. No XML sitemap (or an out-of-date one)

Google crawls and indexes websites much faster than it used to, which might be why so many sites skip the XML sitemap these days. But XML sitemaps are still valuable, and I still believe every website needs one. In a case study published by Bruce Clay, the percentage of pages indexed rose from 24 percent to 68 percent after an XML sitemap was implemented. Indexation issues still happen, and XML sitemaps still help.

Make sure that you add your XML sitemap via Google Search Console to ensure that the search engine is aware of it.

Is your sitemap up to date? If you’re not using a CMS that automatically updates the XML sitemap every time you update the content, this needs to change. Static XML sitemaps are virtually useless in this day and age, since websites are updated so frequently.
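
For sites without a suitable plugin, generating the file itself is trivial; the hard part is wiring it into your publishing workflow so it regenerates automatically. Here is a minimal sketch with placeholder URLs.

```python
# Minimal sketch: write a bare-bones XML sitemap from a list of URLs. In
# practice this should be regenerated automatically on every content update;
# the URLs here are hypothetical placeholders.
from xml.sax.saxutils import escape

def write_sitemap(urls, path="sitemap.xml"):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

write_sitemap(["https://www.example.com/", "https://www.example.com/blog/"])
```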

15. Bad use of subheadings

Here are a few issues I see fairly often with subheadings:

  • Using H1 tags for subheadings. Please don’t do this. The H1 tag is meant to serve as the title for the entire page, and using more than one may confuse the search engines about the topic of the page.
  • Using subheadings inconsistently. What I mean here is skipping straight to H3 tags without using H2 tags, using H2 tags for what are really subsections of another H2, and so on. The heading tags create a very clear hierarchy for search engines to crawl and understand, so don’t mess with the order in which they’re intended to be used. (The sketch after this list flags duplicate H1s and skipped levels automatically.)
  • Using heading tags in the navigation or menu. I’ve seen cases where entire sections of a site shared the same H1 tag because it was included in a common header for the section. This can lead to keyword cannibalization and similar issues, and including heading tags within the navigation can also blur which content belongs to the body.
  • Using bold or size formatting in place of subheadings. While this is less of an issue, I would highly recommend sticking to subheading tags, with the exception of subsections within subheadings. Again, subheadings give the search engines a very clear hierarchy, which can assist them in semantically interpreting your site — a hierarchy which is likely to be less clear if you use size formatting, bolding and so on.
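
A crawler or a short script can catch the first two problems mechanically. Here is a rough sketch that reports pages with more (or fewer) than one H1 and any skipped heading levels; the URL is a hypothetical placeholder.

```python
# Rough sketch: report pages with more (or fewer) than one H1 and any skipped
# heading levels. The URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

def audit_headings(page_url):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    levels = [int(h.name[1]) for h in
              soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]
    if levels.count(1) != 1:
        print(f"{levels.count(1)} H1 tags found (expected exactly one)")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            print(f"Skipped level: h{prev} followed directly by h{cur}")

audit_headings("https://www.example.com/some-post/")
```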

16. Bad use of bold formatting

While correlation studies do point to a positive relationship between bold-formatted keywords and rankings, it’s very easy to interpret these results the wrong way.

SEOPressor ran an experiment to see how adding bold or strong tags to their keywords would affect rankings. The experiment was a disaster.

The page dropped from position 64 to position 84. Obviously, they weren’t testing this on a high-risk page, but the results are fairly definitive, especially since the effect went away soon after the formatting was removed.

Interestingly, another article written by the same author suffered during the same time period, suggesting that you might possibly harm authorship reputation by stuffing keywords into bold tags.

Sycosure ran their own case study in response and saw similar results. After adding bold tags to the primary keyword for one of their articles, it disappeared from the search results entirely. Ultimately, the page did recover before the bold tags were removed, perhaps due to other signs of quality, but the implications are clear.

The lesson here is that bold formatting is probably best avoided as an SEO tactic altogether. It’s certainly useful to highlight content in order to make your pages easier to skim, but it appears to be harmful if it’s associated with your keywords. The positive correlations found in many studies probably have more to do with what search marketers are doing than what is influencing search results.

Over to you

Did you catch yourself making any of these mistakes? No worries — these are incredibly common. What matters is that you fix the problem and put a process in place to keep the problem at bay.

How about you? Any common mistakes I missed?

The post 16 common on-site SEO mistakes you might be making right now appeared first on Search Engine Land.

]]>
Fix your outbound link problem in a single workday /fix-outbound-link-problem-single-workday-278122 Thu, 06 Jul 2017 15:49:37 +0000 http:/?p=278122 Worried that poor outbound linking practices are hurting your site? Columnist Pratik Dholakiya shares his method for identifying and fixing these potentially detrimental links -- without getting rid of the ones that are helping you.

The post Fix your outbound link problem in a single workday appeared first on Search Engine Land.

]]>

Links are the foundation of Google’s algorithm. A naïve reading of the algorithm suggests that you shouldn’t link to anybody else, ever. It turns out that’s not at all true. But your outbound links play an important role in your ability to turn up in the SERPs.

Let’s make sure your outbound link game is on point. Follow the steps below and get yourself situated today.

Background: It’s not just a PageRank issue (and hasn’t been for years)

Google’s original PageRank algorithm was pretty simple. Each page inherits PageRank from the pages that link to it. The PageRank a page passes to each page it links to is its own PageRank divided by its number of outbound links, scaled down by a damping factor. Basically, it was a simulation of how likely somebody was to land on your page if all they did was randomly click links.
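
For the curious, here is a toy sketch of that original idea: nothing like what Google runs today, just an illustration of how rank flows through links with a damping factor.

```python
# Toy sketch of the original PageRank idea: each page splits its score evenly
# across its outbound links, scaled by a damping factor d; the remainder models
# the "random surfer" jumping to any page. Not how Google ranks pages today.
def pagerank(graph, d=0.85, iterations=50):
    """graph maps each page to the list of pages it links to."""
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_ranks = {page: (1 - d) / n for page in graph}
        for page, outlinks in graph.items():
            for target in outlinks:
                new_ranks[target] += d * ranks[page] / len(outlinks)
        ranks = new_ranks
    return ranks

# Three toy pages linking to each other.
print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))
```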

It wasn’t long before SEOs started hypothesizing that linking to other sites could hurt your PageRank — the idea being that you lost a little bit of PageRank by linking out to other sources, perhaps giving their page a boost at your own page’s expense.

Taken in concert with Google’s suggested limit of 100 outbound links per page (a guideline which has since been dropped) and Google’s push to combat link spam by penalizing sites with “unnatural outbound links,” it’s no wonder SEOs and webmasters began to conclude that outbound links were best avoided (or nofollowed) altogether.

It’s true that spammy outbound links can be detrimental to your site. But there is evidence to suggest that, when implemented correctly, outbound links do not hurt — and indeed can actually help — a site’s rankings. Matt Cutts, former head of the web spam team at Google, made it clear way back in 2009 that “parts of [their] system encourage links to good sites.” Google doesn’t want your site to be a dead end.

While Google Webmaster Trends Analyst John Mueller has since made contrary claims, an experiment conducted by Reboot Online demonstrated results that could scarcely be more definitive. By the end of their five-month experiment, the sites with outbound links to authoritative sites were in the top five positions in Google for a made-up keyword, and the sites without outbound links were in the bottom five positions. These results were also true for a second made-up keyword that they didn’t include in the anchor text of those links.

So if you want to optimize your outbound links for search engines, you may need to cut some links (those to spammy or low-quality sites), but it would be a very bad idea to cut all of them.

Let’s talk about how to identify and fix any outbound link issues on your site.

1. Crawl that site

You’ll need to start by crawling your site to get a list of all of the links. There are a lot of tools you can use to accomplish this, but Screaming Frog is a freemium product whose free version can get you everything you need.

Once Screaming Frog is installed and running, enter your site’s home URL in the bar up top:

Press the “Start” button and allow it to crawl your site until the bar to the right reaches 100 percent:

Now, go to the “External” tab:

From here, sort “Inlinks” in descending order, so that the arrow is pointing downward:

The “Address” column now lists the URLs that your external links point to, sorted by the highest number of outbound links on your site.

Next step: Start cutting links.
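
If you would rather script this step than use a crawler GUI, here is a rough Python equivalent that crawls a small site and counts how many of your pages link out to each external URL. The start URL is a placeholder, and the crawl is capped so it stays manageable.

```python
# Rough sketch: crawl a small site and count how many of its pages link out to
# each external URL (roughly what the "Inlinks" column shows). The start URL is
# a placeholder and the crawl is capped to keep it manageable.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import Counter, deque

def external_link_counts(start_url, max_pages=200):
    site = urlparse(start_url).netloc
    queue, seen, counts = deque([start_url]), {start_url}, Counter()
    while queue:
        page = queue.popleft()
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        soup = BeautifulSoup(html, "html.parser")
        page_externals = set()
        for a in soup.find_all("a", href=True):
            url = urljoin(page, a["href"]).split("#")[0]
            host = urlparse(url).netloc
            if host == site:
                if url not in seen and len(seen) < max_pages:
                    seen.add(url)
                    queue.append(url)
            elif host:
                page_externals.add(url)
        counts.update(page_externals)  # count each external URL once per page
    return counts.most_common()

for url, pages in external_link_counts("https://www.example.com/")[:20]:
    print(pages, url)
```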

2. Trim those sitewide links

Our first goal in trimming outbound links is to eliminate as many unneeded sitewide links as possible. These are the outbound links that sit in your site’s navigation, meaning that they link out to other sites on every page of your site (or at least on every page of your site that uses the same navigational template).

When your outbound links are sorted descending by “Inlinks,” sitewide links will sit at the top. They are easy to identify, because the number of inlinks will be comparable to the number of pages on your site.

I wish I could give you a perfect rule that would tell you exactly which outbound links to cut here, but I can’t. If things were that easy, it would ruin a lot of innovative marketing strategies. Still, these guidelines should give you an idea of what to trim.

As a general rule, these are the kinds of sitewide links you may want to keep:

  • Links to your social media profiles — provided you’re strategic about this, the profiles are active, and you aren’t listing dozens of profiles. You should consider a link to a dedicated page on your site listing all of your properties if you have more than three or so.
  • Links to places you were recently featured, provided that this remains in rotation and doesn’t start to pile up. Again, if you have more than three of these, a link to a dedicated page on your site would likely be more useful.
  • Links to a parent or sister company, if and only if this isn’t part of a link scheme.
  • Links necessary in order for aspects of the page to function correctly.

These are the types of links you should consider cutting:

  • Blogroll-type links to sites that you like, especially if you have more than three or so, and especially if the sites aren’t widely considered authoritative. The risk that these could be interpreted as part of a link scheme is fairly high, so you should only keep the links if you feel that the user experience argument is very strong. Consider moving them elsewhere if you want to keep them.
  • In general, any sitewide link to another site that isn’t one of your properties needs a strong case in order to remain in place.

These are the types of sitewide links you should almost definitely cut:

  • Any link with anything other than branded anchor text. No keywords, not even partial ones, unless they are part of the brand name, should be included at all. If you feel elaboration is strictly necessary for the user to understand the link, include it outside of the anchor text.
  • Any advertising or sponsored placement at all, unless the link is nofollowed. Never place an advertisement without nofollowing the link.
  • Anything over three links to external sites that aren’t your own properties should almost certainly be cut from the sitewide navigation and placed on a dedicated page elsewhere, even if the links are completely justifiable. Few things signal spam like a navigation full of external links.
  • Any external link that could be confused for a part of the internal navigation. Don’t place external links in tabs or drop-down menus, or in other locations that most people would assume are part of the internal navigation. This is begging for a penalty.
  • Any links that aren’t readily visible to the user, or that share the page background color even if they are visible, should be reformatted to be visible or removed. Anything that could be misconstrued as a cloaked link to an external site is all but certain to get you penalized.

All right, now that you’ve pruned your sitewide outbound links, let’s look at the rest.

3. Cut the non-authoritative links

Your site is bound to have too many outbound links to check entirely by hand (at least in a single workday, as this guide has promised).

So here’s what we recommend. Copy the rest of your link addresses from Screaming Frog and run a batch analysis with Ahrefs. Use the free trial and cancel if you must. It’s worth it.

After signing up for the free trial and going to the batch analysis page, all you need to do is paste your link URLs into the box and click “Start Analysis.”

When the analysis is finished, sort ascending by “DR,” which stands for Domain Rating. Your links will be listed from least to most authoritative.
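
If you prefer working outside the Ahrefs interface, you can export the batch analysis and sort it yourself. The sketch below assumes a CSV export with “DR” and “Target URL” columns; adjust the names to match whatever your export actually contains.

```python
# Minimal sketch: sort a batch-analysis CSV export ascending by domain rating.
# The file name and the "DR" / "Target URL" column names are assumptions; match
# them to whatever your export actually contains.
import csv

with open("batch_analysis_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

rows.sort(key=lambda row: float(row["DR"] or 0))
for row in rows:
    print(row["DR"], row["Target URL"])
```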

Now it’s time to start cutting your low-authority links.

Let’s be careful in our language here. I’m not saying you should cut every low-authority link. I’m saying that if an outbound link is low-authority, it needs to be justified. We don’t want to be citing a 13-year-old blogger’s WordPress site for important facts. We don’t want to be citing “fake news” sites. We should avoid citing secondary sources if we can. We want to avoid linking to untrustworthy or spammy sites.

In other words, use good judgment when cutting your outbound links.

It would be misleading for me to give you a domain authority cutoff point. Again, use your best judgment and stop where you feel the links have reached a consistent level of trustworthiness.

For each of these types of links you cut, make a note in a spreadsheet indicating the page where the link was and what it used to link to. This will come in handy in the next step.

4. Spruce up your sources

This is the piece that most outbound link strategies are missing.

You can’t just cut outbound links and expect this to help your authority with the search results. Remember the first section? It’s not just about PageRank anymore, and simply hoarding your links isn’t going to do you any favors. The tests show the opposite. Cutting too many outbound links may hurt your own rankings.

Especially when it comes to informational content, like what you would find on a blog or magazine, outbound links are crucial. Anybody who’s been to college knows that you should cite everything, and use authoritative sources.

This is mentioned explicitly in the search quality evaluator guidelines. When looking for “Low Quality Main Content,” raters are asked to watch for content that is “Failing to cite sources, or making up sources where none exist.”

For every link you cut, you should consider replacing it with an authoritative link.

Here I’m speaking strictly in terms of citation. If the original link wasn’t being used as a citation to back up a claim, there’s a good chance you don’t need to replace it. But if you are stating a fact without a citation, adding an authoritative source is highly recommended.

Take a trip through your spreadsheet and collect your replacement sources all in one go. Then go through and update your site with the links. It will go much faster this way.

5. Get a system in place

Finally, it’s time to get a procedure in place to keep your outbound link quality ahead of the game for the future. Add notes such as these to your content guidelines, and make sure they are enforced:

  • Factual statements should be sourced with a hyperlink.
  • Link to the original source if possible.
  • If the original source can’t be found, link to a trusted publication.
  • Links to publications that are less well-known or less widely trusted should be used to attribute opinions, rather than to source facts, unless the publication is the original source.

There you have it. Your outbound link game is on track, your reputation is solid, and your future is looking bright.

Have thoughts on this? Don’t hesitate to tweet me @DholakiyaPratik.

The post Fix your outbound link problem in a single workday appeared first on Search Engine Land.

]]>