Brian Harnish – Search Engine Land

Pro Tip: How to fix 3 not so obvious crawl errors

Index bloat, misconfigured trailing slashes and soft 404s can all impact your site’s performance.

Everyone hates crawl errors. They show up without warning and can cause indexing issues.

In his Reddit AMA last year, Gary Illyes (Google Webmaster Trends Analyst) explained that you must make your site crawlable:

“I really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made up terms by the rank trackers, and that they talked more with developers…”

These tips will show you how.

How to find and fix index bloat

Index bloat means you have more URLs indexed than physical pages.

On a large enough scale, it can negatively impact performance and waste your crawl budget.

Use the site: operator in Google search to find it; don’t include a space between the operator and your domain. If the number of results is much larger than the number of URLs your site actually has, you have an issue.

The operator should be entered into Google like this: site:example.com
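
If you keep an XML sitemap, it gives you a quick approximation of how many URLs you actually expect to have indexed. Here is a minimal sketch, not from the original article, that counts the <loc> entries in a single sitemap file; the sitemap URL is a placeholder, and a sitemap index file would need an extra loop:

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your own sitemap

response = requests.get(SITEMAP_URL, timeout=30)
root = ET.fromstring(response.content)

# <loc> entries live in the sitemaps.org namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

print(f"URLs listed in the sitemap: {len(urls)}")
print("Compare this against the result count Google shows for site:example.com")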

Misconfigured 4xx errors and soft 404s

With normal 404s, 301 redirecting them to relevant working URLs is a good solution. But what if your 404s are not normal 404s?

It’s a common issue. A page without content is a soft 404, even if it shows a 200 OK status.

In Screaming Frog, the default word count reflects every word on the page, not just the main content area, so you’ll need to work out which pages truly have “no content” in Excel after exporting your crawl data.

Create a column in Excel next to Screaming Frog’s standard word count, and subtract the total word count of your headers and footers (any sidebars, other text, etc.) from the total word count displayed.
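
If you prefer to script that subtraction instead of doing it in Excel, here is a rough sketch using pandas. The export file name, the column names and the boilerplate word count are assumptions to adjust for your own crawl and template:

import pandas as pd

BOILERPLATE_WORDS = 250  # assumed word count of your header, footer and sidebars

crawl = pd.read_csv("internal_html.csv")  # hypothetical Screaming Frog export
crawl["Content Words"] = crawl["Word Count"] - BOILERPLATE_WORDS

# Pages at or below zero are likely "no content" soft 404 candidates to review.
soft_404_candidates = crawl[crawl["Content Words"] <= 0]
print(soft_404_candidates[["Address", "Content Words"]])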

A more reliable, but time-consuming, alternative is to manually examine your pages and check how much real text content each one contains.

Misconfigured trailing slashes

Not all URLs are created equal. There is a difference between .htm, .html, and using a forward slash (/). The first two are file names. The last is a folder.

When all three versions load with a 200 status, you’re serving three URLs with the same content.

Serving multiple indexable versions leads to crawl errors and duplicate content issues.

If this issue exists on your site already, redirect all URL versions to one primary version, so only one version loads.
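
To confirm how each version currently responds before and after adding redirects, a quick check of the status codes is enough. The sketch below uses the requests library, and the example URLs are placeholders:

import requests

variants = [
    "https://www.example.com/page.htm",
    "https://www.example.com/page.html",
    "https://www.example.com/page/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=30)
    # A 301/302 response carries the redirect target in the Location header.
    location = resp.headers.get("Location", "-")
    print(f"{url} -> {resp.status_code} {location}")

# Ideally only one variant returns 200; the others should 301 to it.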

Leaner is better

Don’t just chase more content while ignoring these details; they’re important to your site. Create a better, leaner site with fully optimized crawlability. Your users will thank you.

Pro Tip is a special feature for SEOs in our community to share a specific tactic others can use to elevate their performance. You can submit your own here.

Five free Chrome extensions for SEO practitioners

Columnist Brian Harnish details 5 free Chrome extensions he uses on a regular basis, for SEO tasks ranging from screen shots to checking links to content analysis.


Did you know that you can increase your SEO task efficiency by using extensions in Google Chrome? If you perform most of your SEO tasks manually, these extensions can help reduce the headache of repetitive tasks and help you make your work day run more smoothly.

Following are five free Chrome extensions I use on a daily basis to help increase productivity. From screen shots to checking links to content analysis, it’s all here!

1. Nimbus Screenshot and Screencast

As the name suggests, Nimbus is a screen shot and screencast tool. It is handy for capturing and annotating screen shots, which can help when putting together a report for a client, convincing a client to take a particular step or showing them errors we are seeing on our end (that they may not be seeing on theirs). You can make annotations right within Nimbus: arrows, text, image blurring for sensitive materials; it is possible to do all that with this tool.

In addition, you can use Nimbus to create videos. Say you wanted to create a video that shows the client a particularly detailed SEO concept without having to resort to a webcam or webinar session. This extension provides the perfect way to do it, right from your browser. You can also create video demonstrations of SEO tactics or concepts to incorporate into conference slide decks and enhance your presentations.

Nimbus Screenshot differentiates itself from similar tools in that you can take full-page screen shots. I have found it necessary to take full-page screen shots throughout my SEO work, so I recommend having at least one tool that performs this function.

2. Check My Links

Check My Links is a great Chrome extension that makes checking on-page links much easier and more efficient. With just a few clicks, it is possible to identify how many links there are on a page, not to mention which ones are working properly and which are returning 404 errors.

This tool is most helpful for identifying broken links — both on your own website and others. The latter can come in handy for SEOs using the “broken link building” tactic, wherein you find relevant pages linking to resources that no longer exist, create a new resource for them to link to and reach out in the hopes of getting a link to your content.

3. Word Count

Are you performing an SEO audit in which word count is a critical component? Using the Word Count extension, you can perform a quick, high-level overview of word count on certain webpages.

Doing this quickly can help you find quick SEO wins and low-hanging fruit if you need to put your skills on the line to win the client. Having this extension in the browser sure beats having to copy/paste a paragraph or two into Word and perform the word count there.

4. Open Multiple URLs

Open Multiple URLs is one of those extensions that truly streamlines your workflow. It is also a great way to help speed up those processes during link remediation — you know, where you have to open hundreds of URLs at once. When you open the extension and load the tabs with URLs, you can use keyboard shortcuts to speedily move through every window (Ctrl + Tab moves forward, and Ctrl + Shift + Tab moves back through the previous tabs you were on).

Please note: I recommend checking the box for the “load URL only when the tab loads” option that shows up when you open the extension. The reason for this is that some hosts and CDN services (not all, but Cloudflare, for example) will see activity from a single IP visiting all of the sites they serve as spammy, and you could end up being banned from their website networks if you are not careful.

The reason I bring this up is that some large-scale spammers use hosts like Cloudflare now in order to host their sites and build link networks for their own spammy reasons.

It is my suggestion to use a highly customized proxy machine in order to use this extension, so that you can avoid detection from these other networks. Failing that, using a VPN and an IP switcher to hide your tracks can help you avoid detection and other issues.

At the minimum, before doing anything, especially if you are unsure how to do this, I suggest consulting a tech friend you know is able to do all of this and more just to make sure you are doing everything correctly. While the odds are slim that you will be banned from a major (or your favorite) website, it never hurts to be proactive and take care of these things ahead of time.

5. User-Agent Switcher for Google Chrome

User-Agent Switcher allows you to view how your website looks in different web browsers and on different devices, which can be helpful when troubleshooting website and SEO errors that you might not normally see.

For example, say you needed to take screen shots of Google rankings on a mobile device vs. Google rankings on a desktop device. This extension lets you perform that task right on your desktop, without having to fumble clumsily through your mobile and tablet devices to send the screen shots to yourself manually.

Or, say you needed to take a quick screen shot on your desktop of an SEO issue that is visible on your mobile device but not visible on your desktop. This extension allows you to switch to that mobile user agent while on your desktop, and then you can use a program like Nimbus Screenshot to take the full screen shot.

All of this is being accomplished without leaving the safe confines of your desktop! Pretty sweet, eh?

Increase productivity and efficiency, and become an SEO wizard

By using these extensions, it is possible to increase your SEO efficiency dramatically and take hours of work out of your day. I highly recommend experimenting with them and working through the different tasks that you normally perform manually; doing so will make your day go by much faster.

Do you see something you do every day that is not addressed by these (or other programs on the market)? It may be worth thinking about how you spend your day and figuring out how you would automate such tasks using a custom solution.

You don’t have to be a programmer to have a program made. Use your own ingenuity and think through the problem, then find a programmer you trust to code that solution. Your next solution could just be the next SEO sensation.

Link profile analysis: How to prevent penalties by being proactive

Columnist Brian Harnish tackles how to analyze links from a Google Webmaster Guidelines perspective, and how to make sure that your client's linking activities don't cause them to lose everything they have gained.


As most SEOs know, links are still extremely important to ranking highly in the Google search results. In fact, a recent study performed by Backlinko showed that “the number of domains linking to a page correlated with rankings more than any other factor.”

Thus, it makes sense for most large companies with a significant website presence to implement a regular link pruning schedule. At ymarketing, we call our process of link pruning “link remediation.”

The normal way most sites handle a penalty is reactionary in nature. The SEO wakes up one day and finds the scariest of emails an SEO can receive in their inbox: They have been hit with a manual penalty.

Once they notice the manual penalty email, they scramble to analyze their site’s backlink profile. They find they have indulged in one too many link exchanges or article marketing techniques.

Depending on how bad it is, it can take weeks or even months to clean up the link profile before submitting a reconsideration request. At a past employer, I spent almost eight months helping to remove or disavow over 260,000 backlinks to remove a manual penalty. The effort was successful, but it was an astronomical undertaking, and the business lost untold revenue from organic search traffic during those eight months.

Usually, a manual penalty is only reserved for the worst of the worst in violations of Google’s Webmaster Guidelines. A majority of sites affected by ranking issues related to bad links will be faced with an “algorithmic penalty” instead, meaning that the site will lose search visibility as a result of an algorithm update such as Penguin. In these cases, the SEO will not receive a notification from Google.

One way to identify if your site’s been hit with a Penguin penalty is to check your traffic in Google Analytics (or whatever your primary Web analytics platform is). If you see a sharp drop in just organic search traffic one day, it can be cause for alarm. You can use a tool like Panguin to overlay your Google Analytics data with major algorithm updates to see if you’ve been affected.
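
If you export your organic-traffic data, you can also script a rough first pass at spotting those sharp drops. The sketch below assumes a simple CSV with date and organic_sessions columns, which is not how every analytics export is laid out, so adjust accordingly:

import pandas as pd

traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])
traffic = traffic.sort_values("date").set_index("date")

# Day-over-day percentage change; drops past -30% are worth checking against
# known algorithm update dates (for example, with the Panguin overlay).
traffic["day_over_day"] = traffic["organic_sessions"].pct_change()
suspect_days = traffic[traffic["day_over_day"] < -0.30]
print(suspect_days)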

There are ways to keep issues like this from happening, so let’s take a look at what you can actively do to make sure that your link profile is always in tip-top shape. One of the first steps I recommend is creating a regular link remediation schedule that will help you identify and eliminate problem links before they become an issue.

How to identify bad link profiles

Setting up a regular maintenance schedule for checking your link profile is always a good idea, especially if you are working for a large national brand. When you work on huge sites, it can be a challenge to monitor all the resources in the company, including:

  1. Who is acquiring links?
  2. What links are they acquiring?
  3. When are they acquiring links?
  4. Where are they acquiring links from?
  5. Why are they acquiring these links?
  6. How are they acquiring these links?

All of these can impact how Google perceives your link profile. In order to keep manageability to reasonable levels, I recommend having one person in charge of this process from month to month or quarter to quarter (however you want to do it).

The following is a brief listing of what constitutes a bad link based on Google’s Webmaster Guidelines, as well as examples of each:

Guideline violation: “Buying or selling links that pass PageRank”

Buying or selling links for the purpose of impacting search engine rankings is considered a “link scheme,” and Google frowns upon this. The buying and selling of links can take several forms.

“Exchanging money for links, or posts that contain links”

  • These are not always easy to identify, but look for any page on a site that is an obvious “buy a link from me” page.
  • To identify a paid blog post, look for multiple followed links to the same website that have been placed using keyword-rich anchor text.

“Exchanging goods or services for links”

  • This happens a lot in certain industries, such as health and fitness, where free samples or other free products are given in exchange for a link. Alas, it is usually not possible to identify such an arrangement, because the blogger rarely discloses it.

“Sending someone a ‘free’ product in exchange for them writing about it and including a link”

Guideline violation: “Excessive link exchanges”

  • Basically, a link exchange means “link to me and I’ll link to you.” These also include partner pages created exclusively for the sake of cross-linking.
  • A good example of this is any site in any industry that acquires links from partners who also link back to the blog just for the sake of links, and nothing else that may add value from an SEO perspective.

Guideline violation: “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”

  • One way to identify these types of links is to look at whether they come from an obvious article marketing site (anything with the word “article” in the main domain is usually a good guess). Guest posts will be a bit less obvious and mostly cannot be identified, unless the guest comes straight out in the post and says, “I am a guest on so-and-so’s blog.” It is impossible to catch everything, but links like these are usually so few that they may never be a problem. If they are, they will show up and be obvious when you put together your link profile.

Guideline violation: “Using automated programs or services to create links to your site.”

  • This Google Webmaster Guideline includes using anything like ScrapeBox or similar services to create thousands of spammy links (often in a short period of time).
  • Common types of links that are indicative of this kind of violation include forum profile links and blog comment spam. Identifying these should be fairly simple when performing a link analysis on your backlink profile.

Guideline violation: “Text advertisements that pass PageRank”

Guideline violation: “Advertorials or native advertising where payment is received for articles that include links that pass PageRank.”

Guideline violation: “Links with optimized anchor text in articles or press releases distributed on other sites.”

  • For example (from Google’s Webmaster Guidelines): There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

Guideline violation: “Low-quality directory or bookmark site links”

  • This includes any link from any directory that was done just for the sake of the link, without adding any value from an SEO perspective. Almost any directory created within the past 10 years that features only links, without adding any other value, can be considered a violation of this guideline.
  • The major exceptions, however, are local SEO directories. If the link is added to these types of directories as part of a local SEO campaign, this should only affect the local part of the algorithm and should not impact normal algorithmic link acquisition activities, unless something else is seriously wrong with your link profile.

Guideline violation: “Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites.”

  • For example (already available on Google’s Webmaster Guidelines): Visitors to this page: 1,472 car insurance

Guideline violation: “Widely distributed links in the footers or templates of various sites.”

  • The key here is “widely distributed,” meaning that it’s done on such a large scale as to outnumber all other links that have value in that link profile.

Guideline violation: “Forum comments with optimized links in the post or signature.”

  • For example (already on Google’s Webmaster Guidelines):
    Thanks, that’s great info!
    – Paul
    paul’s pizza san diego pizza best pizza san diego

Google further advises in their Webmaster Guidelines that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate their guidelines. They recommend using nofollow to prevent PageRank from passing. In addition, they also recommend “redirecting links to an intermediate page that is blocked from search engines with a robots.txt file.”

Furthermore, their main recommendation for avoiding bad link profiles that can cause a penalty is to “get other sites to create high-quality, relevant links to yours.” The way they recommend doing this is to “create unique, relevant content that can naturally gain popularity in the internet community.” Their very definition of this kind of content means that you get links from people who create editorial content “vouching” for your site “by choice.”

Examples of bad link profiles

While there are many ways to gain bad links, it is not always obvious what is a good profile and what is a bad profile. Let’s take a look at the following examples. See if you can identify bad link profiles.

Here is our first example:

Bad Link Profile Example 1

It is pretty obvious that the above is a bad link profile. The vast majority of links to this profile are composed of article marketing site links with keyword-rich anchor text pointing back to the website.

This does not bode well for the site. Even if they are not currently under a penalty, chances are high that they will eventually lose rankings in Google, either algorithmically or through a manual penalty if the article marketing sites are far too excessive.

Let’s take a look at the next example:

Bad Link Profile Example 2

Now, this is a bit less obvious. Here, we have a site that has a bunch of random links with branded anchor text, and no discernible pattern, comprising about 40 percent of the link profile. In most cases, SEOs would consider this to be a potentially healthy link profile, right?

Wrong. See, we have 10 percent low-quality directories, 10 percent partner sites just for the link, 10 percent article marketing sites and 30 percent excessive link exchanges. This now comprises 60 percent of the entire link profile, compared to 40 percent of all the good links.

It will be necessary to perform link remediation and removal on the bad links just to make sure the 60 percent does not impact the site in a negative way.

Let’s take a look at the next example — it’s a tricky one:

Bad Link Profile Example 3

This is a bad link profile. It has 50 percent of its links coming from article marketing sites and 50 percent coming from .GOV links and .EDU links. Though links from these sites are often considered to be good, the fact is that even without the questionable links from article marketing sites, this link profile looks completely unnatural. You want to have a good variety of backlinks coming from a variety of different sites.

Setting up a workflow to manage your brand’s linking activities

Ideally, you should be performing regular link remediation in order to find and remove bad links from your site’s link profile. This activity will help you find links that are harmful to your site’s rankings. It’s crucial to find and remove these links to ensure that your site never falls under a penalty. To that end, establishing a workflow structure that works for you will be important to ensure the longevity of your website.

Your workflow should contain the processes and tools that you expect to follow and use on a monthly, quarterly, or even yearly basis for link remediation. If you do not have a process in place, expect to spend some time creating the proper process documents in order to achieve the optimal workflow desired from your ongoing link remediation efforts.

Rome wasn’t built in a day, and neither was a high-quality link profile. A general rule of thumb is that Google will expect you to spend at least as much time removing the bad links as you spent creating them.

In addition, I recommend setting up your link remediation activities to occur in regular intervals. If you find that your link profile is getting massive amounts of links every month, you may want to set it up monthly. If you find that your link profile is not getting that many links, you may want to set it up for quarterly or yearly. It all depends on what you find most effective for you in the long run, and this will require some experimentation on your part.

Tools that will be extremely handy for this process include Link Detox, Ahrefs, SEMrush, Raven Tools and the usual Google Search Console link export feature.

Excel will be an important tool for managing your links and creating client-facing documents they can use to examine your efforts in link analysis and cleanup.

The process of link analysis and cleanup

First, you will want to compile a list of all your links from at least three different tools in order to find and identify all of the links impacting your link profile. Please note that just exporting links from Google Search Console will not identify every possible link that is impacting or will impact your link profile.

Following is the process I use. Yours may look a little different, depending on what tools you use and have access to:

  1. Export links from Google Search Console, Ahrefs, SEMrush, and/or Raven Tools to multiple Excel spreadsheets.
  2. Compile all of these links into one Excel spreadsheet.
  3. Remove duplicates. For sites with thousands upon thousands of backlinks, using the “Remove Duplicates” function in Excel tends to break down. In this situation, I suggest using conditional formatting and filtering as suggested by Marie Haynes in her article, “5 Spreadsheet Tips for Manual Link Audits.” (A scripted alternative is sketched just after this list.)
  4. After compiling the spreadsheet, import it into Cemper’s Link Detox tool. This tool will acquire useful data for each link and compile it into one easy-to-use report. For example, HTTP-Code lets you see the HTTP status code that is being output by the page the link is on. Link Loc (or Link Location) tells you where on the page the link appears (footer, content and so on). This is very useful for quickly identifying bad links without having to visit the site. However, it is always a good idea to visit the site manually.
  5. Using this information from Link Detox, you are now prepared to begin your analysis. It is recommended to perform a manual link review in order to rule out any small errors the tool may have missed or to find problems the tool may not otherwise have identified. Make your audit detailed, and categorize bad links based on spam and other obvious issues, including the way in which they violate Google’s Webmaster Guidelines.
  6. The next phase of link remediation is link removal. Once you have identified all of the bad links in the last step, it will now be necessary to move forward with contacting the webmasters of these links one by one. Remember, Google wants to make sure that you spend as much time removing links as you did acquiring them. Be sure to keep a log of the sites that you contact so that you can show it to Google should they ever request to see the log. As a general rule, you should contact webmasters two or three times per round of link removal. After the third attempt to contact them, if you have received no response, you can then proceed with disavowing that link.
  7. After having contacted all of the webmasters behind your links, it is now time to proceed with disavowing any you could not get removed using Google’s disavow tool. Please note that you should only be disavowing the links that are spammy and considered bad links for your site’s link profile. You should not be disavowing any good links unless you found later that your good link turned out to be spam.
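
For those who prefer scripting the compile-and-dedupe portion (steps 1 through 3 above), here is a rough pandas sketch. The export file names and column names are hypothetical, so rename them to match what your tools actually produce:

import pandas as pd

# Hypothetical export files and the column in each that holds the linking URL.
exports = {
    "search_console.csv": "Linking page",
    "ahrefs.csv": "Referring page URL",
    "semrush.csv": "Source url",
}

frames = []
for filename, column in exports.items():
    df = pd.read_csv(filename)
    frames.append(df[[column]].rename(columns={column: "linking_url"}))

all_links = pd.concat(frames, ignore_index=True)
deduped = all_links.drop_duplicates(subset="linking_url")
deduped.to_csv("combined_backlinks.csv", index=False)
print(f"{len(all_links)} rows combined, {len(deduped)} unique linking URLs")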

Final thoughts

Identifying and removing bad links can be a time-consuming process. However, through setting up a link remediation process, you can quickly and easily deal with any bad links that may be a part of your link profile. By being vigilant and watching your link profile like a hawk, you can prevent issues that may otherwise wreak havoc on your clients’ search performance.

How to audit canonicalization and ensure it helps, rather than hinders, your rankings

Columnist Brian Harnish discusses in detail canonicalization issues that may not normally be covered in an SEO audit, and how to effectively address them.


For those who are unaware, “canonicalization” refers to the practice of making sure that for every instance of duplicate content on a site, one version is specified as the “preferred” or “source” URL to the search engines. Basically, you are telling Google, “Of all the URLs that contain this content, this is the URL that you should consider the authority. No other.”

When a proper audit identifying canonicalization issues is not performed, you can run into snags later when Google identifies your site as being a source of duplicate content, which can lead to algorithmic ranking losses, or even manual penalties.

Canonicalization issues generally occur when attempted canonicalization is not executed properly. Following are some common canonical issues that, once resolved, can result in rankings boosts to the site because of consolidated link equity.

Issue: Home page does not canonicalize properly

Many websites wind up with multiple versions of the home page that resolve on different URLs, such as:

  • http://www.domainname.com/
  • http://domainname.com/
  • http://www.domainname.com/index.html

When you have many different versions of the home page — all of which have inbound links pointing to them — this can cause canonicalization issues that will impact rankings.

In order to fix this, choose your preferred home page URL and 301 redirect all the other versions to it. For the www vs. non-www versions, take it a step further by specifying your preferred domain in Google Search Console.

Implementing redirects to the preferred version of the home page will consolidate your link equity, which can potentially enhance your search engine rankings.

Issue: URLs don’t resolve to a single case

This is a big one. URLs that don’t resolve to a single case can result in duplicate URLs, leading to duplicate content issues that put your site at risk.

If you’re a beginner SEO, it is important to consider that duplicate content doesn’t always mean that the same content is duplicated from page to page. It happens quite often that URLs cause duplicate content issues simply by existing in the first place.

Here are some examples of a URL that doesn’t resolve to a single case:

  • https://www.domainname.com/page.html
  • https://www.domainname.com/pAgE.html
  • https://www.domainname.com/PAGE.html
  • https://www.domainname.com/PaGe.html

If you input all of these variations in the address bar of your favorite web browser, they will all bring up the same page. This can become a problem because without proper configuration, Google will spider and index these pages, resulting in non-canonical pages being indexed.
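
One quick way to spot the problem in a crawl export is to group URLs that become identical once case is ignored. A minimal sketch, using a hand-typed sample list in place of your real crawl data:

from collections import defaultdict

# Replace this sample list with the URLs exported from your crawler.
crawled_urls = [
    "https://www.example.com/page.html",
    "https://www.example.com/pAgE.html",
    "https://www.example.com/other.html",
]

groups = defaultdict(list)
for url in crawled_urls:
    groups[url.lower()].append(url)

# Any group with more than one member is a set of case-variant duplicates.
for lowered, variants in groups.items():
    if len(variants) > 1:
        print(lowered, "->", variants)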

The best way to fix these issues is to 301 redirect all of these URLs to the main canonical URL that you choose. It may be beneficial to perform a server-side redirect using Apache’s .htaccess (or whatever technology your server uses) to avoid adding 301 redirects all over the place. Overdoing it with 301 redirects can also cause problems.

Alternatively, you can also use the rel=”canonical” tag to specify the canonical version of the page. That means putting a tag on the page that looks like this:

<link rel="canonical" href="https://www.domainname.com/page.html" />
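
To audit which canonical URL each page actually declares, you can read the tag straight out of the HTML. A small sketch, assuming the requests and beautifulsoup4 libraries are installed and using placeholder URLs:

import requests
from bs4 import BeautifulSoup

pages = [
    "https://www.example.com/page.html",  # placeholder URLs to audit
    "https://www.example.com/PaGe.html",
]

for url in pages:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("link", attrs={"rel": "canonical"})
    canonical = tag["href"] if tag else "none declared"
    print(f"{url} -> canonical: {canonical}")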

Issue: IP address doesn’t canonicalize

In a perfect world, your IP address should canonicalize back to your site’s main domain name. If it does not, you risk indexation issues, because search engines cannot correctly determine which of your website’s URLs they should index. In addition, duplicate content issues can arise from a search engine indexing both your IP address and the URL of the website.

If you determine that you have IP canonicalization issues, speak with your server administrator and discuss potential solutions to the issues.

Issue: duplicate URLs

Duplicate URLs can be just as dangerous as non-canonical URLs. When you have duplicate URLs, and they have no canonicalization in place, Google will have no idea which version to index. This can lead to the indexation of duplicate URLs serving the same content, diluting your link equity and page authority.

Duplicate URLs can take the form of the following, usually resulting in multiple versions of the same URL regardless of the file name extension:

  • https://www.domainname.com/page.html
  • https://www.domainname.com/page.htm
  • https://www.domainname.com/page.aspx
  • https://www.domainname.com/page/

The best way to fix all of these is to implement sitewide redirects that send all of the duplicate URLs back to the main canonical URL. This will help consolidate link equity and result in a performance boost overall.

Issue: URLs can be accessed through both secure (https) and non-secure (http) versions

This appears to be a pretty simple problem, but you would be surprised how often it creeps up in website audits. Usually it results from improper setup of the secure and non-secure versions on the server; Google Search Console settings play no role in how URLs are accessed via the browser.

The best, simplest way to determine this issue is trying to access both the http:// and https:// versions of your site in the browser. If they both load just fine, then you have some issues that should be cleaned up as quickly as possible.

The best way to avoid this problem is to make sure you properly make the switch from HTTP to HTTPS to begin with. (Patrick Stox has written an excellent and comprehensive guide on how to do that here.)

It is my recommendation to get the highest-level secure certificate you can, and make sure it comes with wildcard options. This way, you do not cause any unforeseen canonicalization issues arising from not having the proper secure certificate with the right settings installed.

Issue: trailing slash canonicalization

Similar to duplicate URLs, improper trailing slash canonicalization can also become an issue. For example:

  • https://www.domainname.com
  • https://www.domainname.com/

or

  • https://www.domainname.com/page-name/
  • https://www.domainname.com/page-name

If you have been promoting your website using versions of your URLs both with and without the trailing slash, you could cause indexation and duplicate content issues as a result. Choose one format (I recommend the non-trailing slash version) and stick with it in all of your link building and other promotional efforts. Then do the following:

  1. 301 redirect all variations of the URL using a wildcard redirect back to the canonical URL, and/or
  2. Set the canonical tag to always point to the non-trailing slash version of the page.

The redirect is the preferred solution, but using both is the best option because it removes any ambiguity on Google’s part.

Dive deeper into your audit to find major issues

Canonicalization issues can be a major source of headaches for many SEOs, but if you dive deep enough, you can find and fix many issues plaguing your client’s site. In addition, focusing on these areas can give you a great performance boost that you may not otherwise have been able to obtain with just the usual on-page SEO.

This is because canonicalization factors impact link equity, which, when managed properly, can translate into a major performance boost for your website.

Thinking smarter: Take your SEO work to the next level

Columnist Brian Harnish explains how you can streamline your workload and improve client relations through education, planning and communication.


Search engine optimization is a complex marketing discipline, and it can be a challenge to perform high-level, high-quality SEO work every single day. When you’re working hard to get the best results for your client, it can be tempting to cut corners here and there to meet (or even beat) deadlines.

While this approach may be effective sometimes, ultimately, it can create more work for you down the line. By taking the time to be thorough, you can anticipate and avoid future obstacles that impede progress and create headaches for you and the client.

Let’s discuss a variety of ways that you can think smarter in order to create a more streamlined working experience.

Ensure you have all necessary materials before you start the SEO project

We’ve all been there: You begin to work on an SEO project and find out that you are missing a critical part of what makes the project tick. If you deal with large teams and international brands, chances are that part of the project slipped through the cracks. There are ways to help keep this issue from cropping up time and time again.

If you deal with highly complex projects on a frequent basis, and these projects require client material that takes days or even weeks to obtain, it makes sense to ensure that everyone involved on the client side understands the impact this client material makes.

In these cases, it may be beneficial to spend a couple of hours creating a road map for the client, which they can use as a reference throughout the project. That way, they can have it handy in the future. Create the road map prior to project launch, and then refer to that road map when requesting materials from the client.

The road map should not only outline what materials you might need from the client and by when, but also provide some basic SEO education so that clients can understand why you need a particular item at a particular time. It can be challenging for non-SEOs to remember things like dynamic URLs, other types of project requirements and client-side information that no one is ever realistically going to be prepared for.

Create a realistic project timeline, even if it’s slower than the client prefers

I get it. For those SEOs who are client-facing and have to make sure that their clients are happy every step of the way, it is possible to get caught up and think, “Man, we need to beat Competitor 1 and Competitor 2, so we need to do y and z in a much faster time frame to accomplish this! I will tell the client this, and they will be overjoyed!”

Unfortunately, all too often, the faster approach leads to unmet expectations, poor-quality project deliverables and unrealistic client expectations for next time. When you are forced to explain to the client that an initial project timeline was not right because of things like project scoping, it can be very awkward.

Even though you may want to (and may be capable of) taking the fastest approach and turning everything in 100 percent complete, errors will always creep in and make your project less than it can be.

In other words, always ask: Will this project timeline result in everyone being happy (me most of all, by keeping my sanity)? Or will this project timeline result in a pissed-off client because they did not receive everything they expected to receive?

If you’re an SEO manager, 15 minutes of detail is always better than two hours of yelling

Let’s take this scenario. You’ve just hired a new employee. It’s their first day, and you take a few minutes to hastily explain the rules of the department. He smiles and nods, seeming to understand.

For around eight months, everything is great. Then you look at project completion times, and something isn’t right. He is supposed to be pretty fast, but the project is behind schedule, and his hours are through the roof. Where did things go wrong?

When you speak to the employee, he says that he has done everything the way you explained on his first day. Or so he thought. You get angry. You yell. You spit. You might even curse. But at the end of the day, this problem could have been avoided if you’d spent adequate time in the beginning explaining certain instructions in more detail.

Always ask yourself: Did I adequately explain our SEO processes? Could someone interpret what I said incorrectly? Even long-term employees might misinterpret instructions if they’re not explained in depth.

Even when thinking about the instructions, ask yourself: Is a deeper level of detail required so that everyone on the team can perform at their highest level? If that is the case, then it is always better to spend 15 minutes on that level of detail than to spend two hours yelling at an employee for a mistake that you made in presenting those instructions.

Always be sure to ask: Am I providing these instructions for my benefit (less work) or for the employee’s benefit (greater understanding)?

Don’t fall into the long-term trap of industry rot

Industry rot can happen to the best of us. It occurs when people have been in the industry for a long period of time. We get to a level where we forget that others do not know as much as we do, so we leave out crucial details that may be necessary for the successful execution of a project, because to us, they are details that “everyone should know.”

I’ve been guilty of this, too, so I am constantly checking myself to make sure the information I provide is enough to be valuable, perhaps even impressive in scope. In addition, I am always checking to make sure I provide the very basic levels of info that are necessary for the inquiry and at the same time ensure that my communication is never condescending to anyone who may read it (another minor problem stemming from industry rot).

Client and employee communication can be a delicate balancing act. You want to make sure that everyone is mindful of what they are presenting, so that proper execution of the project can proceed. Industry rot can lead to a little too much ambiguity of detail, and that ambiguity can keep a project from reaching the final resolution stage where everything should come together nicely.

Some SEOs do have the problem of industry rot. We make general assumptions that our new client or employee associate outside the industry knows everything we do, and we gloss over details that are crucial to that project’s success. Or we hate details so much that we don’t bother to make a thorough assessment of the client website’s current status.

Some issues that can arise under the industry rot banner include:

  1. Not knowing what deliverables will be required for a project. Without reviewing the details of a particular client’s website or SEO status, we can’t identify sections of a project that might take more or less time than anticipated. For example, perhaps a site widget is generating dynamic URLs for portions of the site that you expected would have static URLs. Optimizing these URLs might then take longer than expected, as you try to figure out how to work around (or with) this widget.
  2. Using technical terms that are basic to us but jargon to the client. Meta descriptions. Title tags. Alt text for images. As experienced SEOs, these concepts are generally well ingrained into even the most seasoned practitioners. However, assuming your client understands these terms can lead to important details being glossed over. So, how much should you explain? If you know the client well, this can be a judgment call to make in the interest of communication efficiencies. If it is a new client, it’s best to explain industry terminology so that the client is aware of these items and their impact on project deadlines and the final outcome of the project.
  3. Not understanding what is required to obtain necessary client materials. Say you have a client that requires legal compliance to sign off on any website copy optimizations due to advertising regulations (common in the pharmaceutical, finance or legal industries). Late materials can impact the deadlines of these projects significantly, so factoring in time for these legal reviews is crucial to setting expectations and delivering results.

When you fail to communicate with the client or review the details of the project in full, it creates problems. It can lead to awkward client conversations down the line about having to extend deadlines because of a lack of oversight; it can lead to the SEO turning in inferior work with a deadline crunch; and it can lead to the internal team dynamic of forcing push-back in order to obtain proper timelines for completion of the SEO evaluation.

Step out of your comfort zone and always be learning

At the end of the day, taking your SEO work to the next level also means attending industry conferences, expanding your SEO knowledge and developing relationships with industry partners. Stepping out of your comfort zone and tumbling down the rabbit hole is necessary in order to grow. Keep expanding your thinking beyond the traditional.

Stay updated on the industry by reading on a regular basis, keep learning new tools, take webinars and attend conferences. Learn from your successes and failures by developing case studies around projects you’ve completed. Doing this will allow you to be more confident and effective, both in your client communications and in your work itself.

As an SEO leader, if you don’t step out of your comfort zone, you will be left in the dust by the more seasoned industry veterans. Reach your goals beyond that comfort zone, develop new ones and exceed the old ones.

How else can you expect to take your work to the next level?

What your teacher didn’t tell you about optimizing site speed

Site speed impacts your search engine rankings, so how can you make improvements? Columnist Brian Harnish details a few ways to decrease page load time for your top-performing pages.


Despite site speed being a ranking factor in Google search results, fast websites aren’t the norm. Your site likely has room to improve. By observing minor details, it is possible to significantly decrease web page load time — and consequently increase SERP performance.

Let’s take a look at some of the less common methods to decrease page load time for better performance in the SERPs.

Assess your current load time performance with Google Analytics

If you have Google Analytics set up on your website, finding out how your pages perform should be a relatively easy chore. Simply navigate to Behavior > Site Speed and review the various reports contained therein.

The Page Timings and Speed Suggestions reports will show your top pages, along with their performance stats, plus suggestions for improving page speed. These reports will help determine the pages you want to prioritize.

Once you have assessed your current page load times with Google Analytics, you will want to analyze the factors of your site that are causing issues. Are non-optimized images the primary culprit? Perhaps it’s overly bloated code? A bad server? Or all three factors at once?

Attacking each of these issues in phases, as budget and priorities allow, will help you assess exactly how much each factor impacts your site’s page speed — and how much fixing it improves your site’s performance in the SERPs.

Pre-load all page-level elements where applicable

Every little bit helps, right? In all the website audits I have performed, I am always amazed to find that there isn’t at least one pre-loading script on the site. It’s not that hard to code a JavaScript pre-loader, and it concerns me that such an easy part of on-page optimization is so often overlooked.

By pre-loading on-page elements like images, you can reduce the load time of your site significantly and help increase its overall performance. You run no risk of anything negative happening to your site on Google as a result, so why not?

In addition, there are ways to pre-load page-level elements with CSS, as shown in this example. Where there are CSS alternatives, it is a web best practice to use CSS over JavaScript. Why? Because JavaScript presents problems when people who visit your site have JavaScript turned off.

If your Google Analytics account does not show any visitors with their JavaScript turned off, do you need to worry? Yes. You never know when that random tech-savvy visitor will show up on your site with their settings set that way.

That alone is reason enough to utilize considerations for as many browsers and platforms as possible, so long as budget, priorities and project scope allow.

Make sure all images are properly optimized

It is a well-known industry best practice to ensure that all images are properly optimized. This means that you should not use 2.5 MB JPGs on the page, crunched into a 150 x 150 pixel image. You must ensure that all pixel information is properly crunched in a program like Adobe Photoshop before you upload your image.

If this step is not performed, you will end up with a 150 x 150 pixel image that still has a 2.5 MB file size. Wait a minute: how can the image be 150 x 150 but still weigh 2.5 MB?

The answer lies in the fact that it was not physically compressed. When you properly compress an image, you not only reduce its pixel dimensions, you also reduce its file size. Adobe Photoshop performs what’s called “lossless compression,” a type of compression that leaves the optimized image looking essentially identical to the original.

The ideal size range to target for optimized images within content is around 15-50 KB, depending on pixel dimensions. Obviously, a 700 x 700 photo is going to be much larger than a 150 x 150 photo, so use your judgment based on your audience’s connection speeds.

However, just taking a saved image and resizing it in a CMS like WordPress will not work. Why? Because WordPress only changes the dimensions at which the image is displayed; it does not reduce the underlying pixel dimensions and file size at the same time.

This is why a two-part process is required: 1. Take the image and physically resize its pixel dimensions in Photoshop, 2. THEN add it to WordPress. Of course, step 2 is eliminated if you’re hand-coding, because all you have to do is code the width + height into the image.
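
If Photoshop is not available, the same two-step outcome (smaller pixel dimensions and a smaller file on disk) can be scripted. A rough sketch using the Pillow library; the file names, the 700-pixel target and the JPEG quality setting are placeholder choices:

from PIL import Image

source = Image.open("original-photo.jpg")  # placeholder input file

# thumbnail() shrinks the pixel dimensions in place, keeping the aspect ratio.
source.thumbnail((700, 700))

# Saving at a moderate JPEG quality also shrinks the file size on disk.
source.save("optimized-photo.jpg", "JPEG", quality=75, optimize=True)
print(f"New pixel dimensions: {source.size}")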

This brings us to our next point: Always make sure your images are coded with the width and height. Why? Because otherwise, the browser has to guess the size of the image. It adds an extra step to the rendering process, which thereby adds precious milliseconds to load time. Are you impressed yet? No? Let’s move forward, then…

Code the right way by thinking “minification from the start”

Creating a site that has thousands upon thousands of lines of code is all fine and dandy. But if those lines of code become redundant, they become liabilities to your site’s load time, sometimes increasing it tenfold if you don’t pay that much attention to it. This is why a “think minification” approach is one of the best approaches to attaining coding nirvana.

How many divs do you really want to use in your content? How many tables? (I hope you are not still using tables for design. It’s an antiquated method, and the W3C states that tables should only be used for tabular data, not for layout reasons.) Do I really want to slice this image up into four slices? Or, would it be better to use one image and optimize it to its core? (This is a decision that will depend on the size of the image.)

Here is an example that takes an extreme coding SNAFU situation and turns it into a beautiful thing. Look at the sample page code below. You’ll notice there’s a lot of inline CSS that is causing code bloat, and likely some issues with some browsers being confused about what the CSS wants to have happen.

Code Example 1

By condensing this coding into its minimal form and using CSS to achieve the absolute minimalist markup we can, it is possible to decrease page load time via minification. By observing proper planning and execution, our load time can be ever so slightly minimized above and beyond the call of duty (which is exactly the result we want):

Code Example 2


On an existing website, minify JavaScript, CSS and all code bloat

If your site has more than two or three JavaScript files, this counts as over-implementation of JavaScript. The reason this matters is that the more calls to the server your on-page elements make, the more bottlenecks you introduce into your site speed.

The same goes for many CSS files at once. When you add 10 JavaScript files on top of 10 CSS files, it can cause some major speed bottleneck issues.

The general recommendation is to make sure that your server handles no more than two or three JavaScript files and/or two or three CSS files per page in order to keep your server calls in check.
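
A quick way to see how many of these calls a page currently makes is to count the external script and stylesheet references in its HTML. A small sketch, assuming the requests and beautifulsoup4 libraries and a placeholder URL:

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"  # placeholder page to check

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=30).text, "html.parser")

scripts = [tag["src"] for tag in soup.find_all("script", src=True)]
styles = [tag["href"] for tag in soup.find_all("link", rel="stylesheet")]

print(f"{len(scripts)} external JavaScript files, {len(styles)} CSS files")
for asset in scripts + styles:
    print(" -", asset)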

You don’t always have to use minification plug-ins to minimize the impact that multiple JavaScript files have; you can perform the minification manually on all of the offending files. The reason for this is that minification plug-ins don’t always perform the proper optimization. In fact, some plug-ins can add even more code bloat.

When in doubt, always go the manual route.

Strive for less than one second load time across all connections and devices

The following quote is from “How Loading Time Affects Your Bottom Line” on the Kissmetrics blog:

Load time is a major contributing factor to page abandonment. The average user has no patience for a page that takes too long to load, and justifiably so.

It is imperative to strive for less than a one-second load time across all devices for every page of your site. Now, shaving two or three seconds off your load time may not sound like much. However, it really can mean the difference between a successful site and a haphazard site.

What is the reasoning behind this seemingly impossible metric of one-second load time? According to Kissmetrics, “A 1-second delay in page response can result in a 7% reduction in conversions.”

This means that “if an e-commerce site is making $100,000 per day, a 1-second page delay could potentially cost you $2.5 million in lost sales every year.”
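
(To spell out the arithmetic behind that figure: $100,000 per day at a 7 percent conversion loss is $7,000 per day, or roughly $2.55 million over 365 days.)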

That is a heavy price to pay to continue operating a site that has a 7- to 8-second load time. So please, make your site load in one second or less. Your visitors (and Google) will thank you.

Note: Is this always realistic? No. Budget, priorities and other things like project scope will need to be taken into account as you make your decision on this. It is important to use your own discretion and best judgment when deciding whether or not this will be a good move for your project.

Special considerations for mobile

Google’s Guidelines for mobile are a good place to start when it comes to making sure your site is an optimized utopian user experience. But what do you do when you want to consider page speed optimizations for mobile?

First off, depending on the type of site you are working on (informational, e-commerce or something else), you will want to keep the complexity simple. Don’t use JavaScript and overly complex dynamic server-side executions to present your site (unless it is lightweight and works for your site).

Keep things simple with a single style sheet using multiple media queries with strategically optimized images. What do I mean by strategically optimized images?

Here is one example: if you use a header image, create the header image in such a way that you can dynamically resize it through the media query by using the same header image. Don’t use multiple images for multiple media queries. All that does is increase calls to the server and create a bandwidth bottleneck that can be challenging to optimize after the fact. Remember our JavaScript example? Keep calls to the server at a minimum.

Next, make sure your images are also quality-optimized for mobile. Take load time into consideration first when optimizing, and then consider quantities of images. Focus on the minimization of both in your quest for a fast-loading mobile website.

These are by no means the only things you can do, but they will help

By following these recommendations, it is possible to increase site performance tenfold. Looking at minification, Google Analytics and overall site speed issues and ensuring their speedy resolution will help add to that performance.

How To Compile A Top-Notch Competitive Analysis For Search

In order to beat the competition in search engine rankings, you must first know what they're doing. Columnist Brian Harnish shares his tips for putting together a solid competitor analysis.


When it comes to search engine optimization (SEO), it’s crucial to research your competition. Knowing where you stand in relation to your competitors will help inform the strategy and tactics needed to achieve your client’s SEO goals, allowing you to focus your efforts and set realistic expectations.

Who are the top competitors? What is driving their results that you are not doing? When do they publish content? Where do they share their content? Why do they do what they do? And how do they do it? All of these questions should be answered thoroughly, because their answers will be a driving force in your optimization strategy.

Unfortunately, putting together a competitor analysis can be a tricky and time-consuming endeavor. But if you follow the steps below, analyzing your competitors just got a little easier.

1. Always Find Out What The Client Wants

The true measurement of any competitor analysis’ success is how well it helps the client achieve their primary objective(s). Whether it’s obtaining new leads, increasing brand perception, driving traffic for dollar conversions or something else, one thing is certain: success must be measurable.

Accurate reporting of clearly established success metrics is essential in communicating to the client that their goals and objectives have actually been achieved. Errors in interpreting campaign data can turn what would otherwise be a great success into a failure.

This is why it is important to find out the client’s main objective in the beginning and always make sure to keep it top of mind during the analysis.

2. Determine Your Competitors

When you create a competitor analysis, you are taking a magnifying glass to what is driving rankings for websites that appear above your client’s in search results for your target keywords. Thus, you’ll want to find out what your client wants to rank for and build your competitor analysis around that.

Note: My advice is to limit this list of target keywords to 10 or fewer. While everyone would like to rank for many keyword phrases on Google, timely completion of a competitor analysis can mean the difference between success and failure — so you have to have a limit in place.

Your client may need you to advise on which keywords are worth targeting, and you should be prepared to provide some recommendations. Of course, choosing target keywords has gotten a bit more complex in recent years.

In the early days of SEO, you wrote a bunch of content focused on a keyword phrase, and that would be it. Now, it’s more important to choose topics that are of interest to your audience and build your keyword strategy around that.

Decide which topic or topics are relevant to your client, then perform your keyword research around those topics. Find the keywords related to your topic that have good search volume and are not impossibly competitive. (If you have a massive budget and can afford to get your site into the top echelon of those competitive keyword phrases, then sure — go after them! But don’t expect it to be easy or quick.)

Once you have developed your keyword list, you can then identify the top 10 competitors for each. Perform a search for each target keyword, and note the websites that appear on the first page of search results.

When we identify these sites, we are only creating an initial list of competitors. The list can later be narrowed down to include just the biggest “threats,” or it can be edited to include specific competitors if the client requests them.
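One way to turn those first-page observations into an initial list is to tally how often each domain appears across your target keywords. Here is a minimal sketch; it assumes you have recorded the ranking domains by hand, and the keywords and domains shown are placeholders:

```python
from collections import Counter

# First-page domains recorded manually for each target keyword (placeholder data).
serp_results = {
    "blue widgets": ["widgetco.com", "acme.com", "widgetworld.com"],
    "buy blue widgets": ["acme.com", "widgetco.com", "shopwidgets.com"],
    "blue widget reviews": ["widgetworld.com", "acme.com", "reviewsite.com"],
}

appearances = Counter(domain for domains in serp_results.values() for domain in domains)

# Domains that show up for the most keywords are the likeliest competitors.
for domain, count in appearances.most_common(10):
    print(f"{domain}: appears for {count} of {len(serp_results)} keywords")
```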

Once we’ve identified our client’s main competitors, we can begin our analysis.

3. Find Out How Many Indexed Pages The Competition Has

The number of indexed pages is an important metric to pinpoint. Why? Because the more indexed pages a website has, the more Google is — generally — going to crawl it (especially if the site is updated regularly).

To find out how many pages of a website are indexed in Google or Bing, you can perform a “site:” search from within the search box. So, for example, you would type “site:domain.com” into the search box and see how many results came up.
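If you are checking several competitors, it can save time to generate the queries up front. A tiny sketch (the competitor domains are placeholders):

```python
# Build site: queries for a list of competitor domains (placeholder names).
competitors = ["competitor-one.com", "competitor-two.com", "competitor-three.com"]

for domain in competitors:
    # Paste each query into Google or Bing and note the reported result count.
    print(f"site:{domain}")
```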

Keep in mind that indexed pages impact crawling, not rankings. Sheer quantity of content, just like sheer quantity of links, is never a guarantee of high rankings. You can make 300 content updates a month and still only get 100,000 visitors a year; conversely, you can make 15 content updates a month and drive more than 500,000 visitors a year. It all depends on how good your SEO is.

In terms of a competitor analysis, a high number of indexed pages doesn’t always mean that a site is impossible to beat. It doesn’t always equal better site authority or better rankings. (Matt Cutts, Google’s former head of web spam, talks about this here: More Pages Does Not Equal Higher Rankings.) However, indexed page count and rankings often do correlate.

Discovering how many indexed pages a competitor’s site has will give you an idea of what you’re up against in terms of content to compete with. And that brings us to our next section.

4. Look At What Your Competitors Are Writing About

By looking at your competition’s content, you can find out what your audience is willing to read, how much they are willing to read, what they are willing to share, when they read it, when they share it and why they shared it. You can also get a general word count of their top-ranking articles to get an idea of what performs best in your industry.
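For a quick, rough word count of a competitor’s article, a small script can help. The sketch below is a simplification: it assumes requests and beautifulsoup4 are installed, that the main content lives in an <article> element (which is not true of every site), and that the URL is a placeholder.

```python
import requests
from bs4 import BeautifulSoup


def rough_word_count(url):
    """Estimate the word count of a page's main content area."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Prefer the <article> element if present; otherwise fall back to the whole body.
    main = soup.find("article") or soup.body
    text = main.get_text(separator=" ", strip=True)
    return len(text.split())


if __name__ == "__main__":
    print(rough_word_count("https://www.example.com/top-ranking-article/"))  # placeholder URL
```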

It is a good idea to observe the content that’s out there and work to create what Moz’s Rand Fishkin calls “10x content” to beat them. Fishkin defines 10x as content that is “10 times better than anything I can find in the search results today. If I don’t think I can do that, then I’m not going to try and rank for those keywords.”

Looking at your competitors’ content, you can answer the most important question your client will have: What is my competitor writing that is performing well virally, socially and organically?

  • Virally: Look at content that has gained the most reads, shares and overall positive feedback across your client’s industry. This will give you a clue to the kind of content that will perform well. Remember, though, that just because one idea goes viral, it doesn’t always mean that it will go viral again. What we are looking for here is resonance: how that content speaks to readers and how they react to it. Once we find out what content resonates with your audience, we can write content that may garner a similar reaction.
  • Socially: Using a tool called BuzzSumo, it is possible to gauge the social reach of a piece of content. Searching BuzzSumo by keyword or domain shows how widely similar content has been shared. If you’re looking to write something, find out how the topic has performed in the past across all the major social networks by using this tool (a rough sketch for summarizing exported share data follows this list).
  • Organically: Looking at search engine results pages (SERPs) for a given keyword, you can figure out how well content has performed organically. This matters because you want to understand how Google views that kind of content. Keep in mind, however, that just because one site’s content is ranking well in SERPs, it doesn’t mean that your similar content will perform the same way. We are simply assessing things like word count, who wrote the content, what was written that performed well, when it was written, why it was written and how it was written. This way, you can create 10x content that can beat them.
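If you export share data from a tool such as BuzzSumo to a CSV, a short script can rank the most-shared topics for you. This is a sketch only: the filename, the column names and the per-network share columns are all assumptions to adapt to your actual export.

```python
import csv
from collections import defaultdict

# Assumed CSV layout: one row per article, with "url", "title" and per-network share columns.
SHARE_COLUMNS = ["facebook_shares", "twitter_shares", "pinterest_shares", "reddit_shares"]

totals = defaultdict(int)
titles = {}

with open("competitor_shares.csv", newline="", encoding="utf-8") as f:  # placeholder filename
    for row in csv.DictReader(f):
        titles[row["url"]] = row["title"]
        totals[row["url"]] += sum(int(row.get(col) or 0) for col in SHARE_COLUMNS)

# The most-shared pieces hint at what resonates with this audience.
for url, shares in sorted(totals.items(), key=lambda item: item[1], reverse=True)[:10]:
    print(f"{shares:>8,}  {titles[url]}")
```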

5. Examine Your Competitors’ Link Profiles

How many inbound links do your competitors have? And more importantly, where are they coming from?

Just like content, sheer link quantity is not the way to build a solid link profile. If a competitor has 300,000 links, what are those links doing? Do they look like this?

  • 50,000 links from low-quality directories.
  • 20,000 links from article marketing sites.
  • 100,000 links from gambling sites with high PageRank.
  • 100,000 links from porn sites with high PageRank.
  • 30,000 links from no-follow blog comment spam.

Or do they look more like this?

  • 50,000 links from .EDU sites.
  • 50,000 links from .GOV sites.
  • 50,000 links from niche authority sites.
  • 50,000 links from press releases.
  • 50,000 links from miscellaneous sites.
  • 50,000 links from niche blogs that aren’t comment spam.

You can have 300,000 links and nonetheless have an awful link profile, as in the first example. Low-quality directories, article marketing sites, gambling sites, porn sites and no-follow blog comment spam are all bad sources of links.

Links like these are considered “link schemes,” and they have the potential to result in a penalty that decreases search engine visibility. Such links are often artifacts of SEO tactics from years gone by, when Google’s spam detection was less sophisticated.

The main takeaway here is that if you want to emulate a competitor’s link-building strategy, make sure that you aren’t doing anything egregious enough to cause your site to incur a penalty.

Examine the competitor’s link profile for healthy link ratios, non-spam referring sites and a solid, Google-friendly link-building strategy. If the competition is relying on bad links to earn their rankings, differentiate your site from theirs by following a healthier path.
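If you export a competitor’s backlinks from whatever tool you use, a small script can summarize the rough composition of the profile. This is a minimal sketch under stated assumptions: the CSV filename, its source_url column and the crude category rules are all placeholders to adapt to your own export and niche.

```python
import csv
from collections import Counter
from urllib.parse import urlparse


def categorize(domain):
    """Very rough link-source buckets; refine these rules for your own niche."""
    if domain.endswith(".edu"):
        return "edu"
    if domain.endswith(".gov"):
        return "gov"
    if "directory" in domain:
        return "directory"
    return "other"


counts = Counter()
with open("competitor_backlinks.csv", newline="", encoding="utf-8") as f:  # placeholder filename
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc  # assumed column name
        counts[categorize(domain)] += 1

total = sum(counts.values())
for category, count in counts.most_common():
    print(f"{category:<10} {count:>7,}  ({count / total:.1%})")
```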

Don’t always assume that the competition is doing things the right way. This is how most inexperienced SEOs get themselves into trouble.

Putting It All Together

By examining your competitors and improving upon their strengths (while avoiding their weaknesses), it is possible to formulate a strategy that will help propel your next project to #1 in the SERPs.

Don’t ever be afraid to dig deep into the rabbit hole. You may be surprised at what you find.

The post How To Compile A Top-Notch Competitive Analysis For Search appeared first on Search Engine Land.

]]>