Stephan Spencer – Search Engine Land

There’s no shortcut to authority: Why you need to take E-A-T seriously
Mon, 08 Apr 2019 17:57:31 +0000
Following expertise, authoritativeness and trustworthiness guidelines should be a part of every SEO strategy no matter your niche.

The post There’s no shortcut to authority: Why you need to take E-A-T seriously appeared first on Search Engine Land.

Recently, I was talking to a client about a new project when he raised an interesting question. He was curious to know how I measured E-A-T, or “expertise, authoritativeness and trustworthiness,” in relation to SEO.

If you’re unfamiliar with E-A-T, it’s a term taken from Google’s Quality Rater Guidelines, a set of instructions that Google’s army of many thousands of human reviewers (known internally as “raters” or “Search Quality Evaluators”) use to assess the quality of web content manually.

Although it is sometimes difficult to understand Google’s internal processes, from what I’ve heard from reliable sources at Google, E-A-T is applied specifically to YMYL (Your Money or Your Life) websites, that is, to sites that offer medical or financial advice.

As you might expect, Google would rather not serve up misleading or unreliable advice that could affect your financial or physical well-being, so paying particular attention to the information touted on these types of websites is very important.

Now, my client’s website was not in a financial or medical niche, so technically these guidelines do not affect him directly. After all, Google, so they say, is not applying E-A-T across the board. But I didn’t tell the client not to worry about E-A-T.

Far from it.

In my opinion, expertise, authoritativeness, and trustworthiness are things that every business should be looking to build both online and offline. What business wouldn’t want to be recognized and trusted within (and beyond) their industry?

I asked my co-author on The Art of SEO, Eric Enge, to weigh in on this issue of E-A-T and whether it’s applied to YMYL only or more globally across the Web. He responded “Google is very technical and precise in how they use various terms. Within Google, as I understand it, E-A-T refers to something that they apply specifically to YMYL sites. But that doesn’t mean that the general ideas that we all associate with E-A-T aren’t likewise applied to other sites.”

In an excellent article, Chris Silver Smith argues that Google partly uses a numerical score to calculate E-A-T. When I emailed him, he replied that “If Quality (the combined E-A-T) is partly a numerical score as I’ve long theorized, then that factor is weighted much heavier for YMYL pages/sites than for things like entertainment pages, or articles about non-YMYL topics, etc. But, E-A-T still applies to things like e-commerce pages, even when those are not as high-priority as YMYL.”

So, if you believe me and these two highly regarded SEO practitioners, following E-A-T guidelines is good for SEO no matter your niche. In my view, even if E-A-T only applied to certain industries, the techniques used to build authority should be a part of every SEO strategy.

The problem is, E-A-T is notoriously difficult to measure.

How does Google measure authority?

If you talk to different people in the SEO industry, they will have different theories about the signals that Google uses to assess the authority of your site and assign rankings. We know that backlinks from authoritative sites are one way. CTR (Click Through Rate) is theorized as another, although Gary Illyes of Google contradicted that recently in his Reddit AMA. We also know that content quality is important. Online reviews may also have some impact.

Exactly how Google uses all these factors to make a decision is somewhat of a mystery, even to Google engineers. That’s because machine learning algorithms are opaque as to which signals they use. No one can see inside the black box – even the programmers who originally coded the AI.

In the aforementioned article, Chris Silver Smith argues that, rather than weighing one particular signal above all others, Google’s approach to assessing authority is more “holistic.” Google’s algorithms almost certainly use a wide range of signals and metrics to evaluate where a page might rank, meaning simply focusing on one signal while ignoring others is not a shortcut to results. Backlinks are crucial; but acquiring high-quality links to a site with shoddy coding, poor online reviews and spammy content won’t work.

Instead of looking for a “silver bullet” to achieve rankings and traffic, it’s important to pay careful attention to how you present your brand online overall, from the way your site is coded right up to your branding and PR strategies.

Holistic SEO

I’ll admit, saying that Google’s approach to rankings is “holistic” may sound a little vague and unsatisfying. It raises the question: what do you focus on if you want to optimize your site?

Thankfully, it’s not that difficult.

For the most part, building authority in your niche is common sense. If you’ve been working hard on gaining backlinks from quality sites, creating remarkable content, and ensuring your site is free from errors and thin content, then you’re well on your way.

But what can you do beyond the basics to ensure that Google sees you as a trusted site in your niche?

The answer is simple, but not easy. That is, do whatever it takes to ensure you have a solid reputation both online and offline.

On your site, be completely transparent about who you are. Create detailed “about” pages that put a human face to your company and tell your story (see mine as an example). Provide an excellent customer experience and respond to negative reviews online. Be open and honest about your processes and provide expert advice to your clients whenever you get the chance.

Link building is still a cornerstone of SEO; but gone are the days when you can simply spam people with generic emails offering content for guest posts. Instead, aim high and focus on quality over quantity. For example, I recently published an article in Harvard Business Review. Since this is a prestigious outlet that is very discerning about who it publishes, the link is incredibly powerful in the eyes of Google.

Building authority online takes time, but the payoff is huge. Start with the obvious questions. Is there an expert at your company who might be willing to do a TEDx talk? What’s the most respected publication in your industry, and how can you get published there? What about industry groups? What kind of connections do you have in the media that might be able to help you? Do you do any noteworthy charity/nonprofit work that has a powerful message that might be of interest to journalists?

If you’re not sure where to start, hire a PR agency or better yet, buy a book on PR and teach yourself.

Trust metrics

Fine, you might say. All this is good stuff, but (getting back to my client’s question) how do I measure the impact of this kind of work?

For one thing, a successful PR/link building campaign that lands you links from high authority sites will definitely begin to impact your traffic and rankings.

If you’re looking for more quantifiable metrics, then consider investing in tools like Majestic and LinkResearchTools. I find that Majestic’s “Trust Flow” and LinkResearchTools’ “LRT Trust” metrics still give the best indication of how trusted a particular page is.

Both these scores are based on your link profile. Although this is just one aspect of all the elements Google takes into account, it’s still the best indication that we in the SEO community have available to us on how much trust you are endowed with. In Majestic, I recommend aiming for a Trust Flow score of 50 or above (the highest score is 100) and an LRT Trust score of 5 or above (the highest score is 10). Make sure that you are looking at these metrics at the page level, not just the domain level.

As both of these scores are relatively high-level, it’s not possible to measure the incremental changes in E-A-T my client was interested in, as in every link he acquired and every piece of content he updated. Still, tracking your trust scores over time will give you a sense of whether your site is increasing or decreasing in trust. Some of the tools even provide a history of trust scores; Majestic, for example, goes back 18 months with their Trust Flow History Tool.

In addition, it is worthwhile to conduct regular surveys of your customers and review brand sentiment metrics to get a sense of how people view your brand. If you see a lot of negative sentiment, you’ll want to take action to remedy it quickly.

As with most things SEO-related, it won’t be a single link or a piece of viral content that suddenly launches you into the pole position. It takes a sincere and sustained effort over months, even years, to get to the top.


5 ways ignoring SEO could affect your bottom line
Thu, 18 Oct 2018 16:35:00 +0000
It may be possible to run a business in 2018 without doing search engine optimization, but doing so exposes you to risks and leaves money on the table.

The post 5 ways ignoring SEO could affect your bottom line appeared first on Search Engine Land.


Anyone who has tried to make the case for investing in search engine optimization to a client, boss or colleague will be all too familiar with the common objections: it’s too unpredictable, PPC is better, it takes too long, and so on.

There’s a common misconception that the financial benefits of SEO are not as clear as those of, say, social media or PPC. But anyone familiar with SEO knows that it is highly measurable and, in most cases, even better value for money than either.

And, many businesses seem to have realized this: In 2016, Borrell and Associates predicted that the SEO industry would reach $80 billion a year in revenue by 2020.

A well thought-out SEO strategy will bring more qualified traffic to your website. Quality content tailored to the needs of your customers will bring a higher conversion rate. These things are well known. But ignoring SEO altogether doesn’t just mean losing a few sales here and there: it can be risky or potentially disastrous financially.

Here are five ways that it could affect your bottom line…

1. Using PPC as a replacement for SEO

Many people have attempted to settle the SEO vs. PPC debate by trying to calculate a definitive conversion rate for both. While many of these studies provide valuable insight into both paid and organic traffic, there are a number of variables that cannot be captured by a simple percentage.

Take, for example, the fact that a site optimized for SEO might target keywords for people at different stages of the sales funnel, from educational articles through to product pages. PPC campaigns, on the other hand, tend to send leads directly to a sales page.

Many of the people who convert via PPC may have been primed by content they accessed organically: according to Ipsos, 44% of online shoppers begin by using a search engine. The same could also be true of phone or brick and mortar customers: according to Acquisio, there will be 73 million phone calls generated by mobile search alone by the end of 2018.

PPC often has a higher conversion rate than SEO, but it casts a narrow net. SEO allows you to broaden that net and reach potential customers at the very start of their purchase research. Without it, you’re leaving leads and sales on the table and missing an opportunity to build your brand’s authority.

2. Missing out on, or misunderstanding, lucrative niches

You might think that optimizing for a few high-traffic terms in your niche is enough. But one of the most valuable parts of SEO is gaining access to the thought processes of your customers.

Recently, a colleague of mine optimized a site for an RV dealership in Oregon. Initially, the dealership wanted to create a page for people searching for RVs in Portland. Some basic keyword research revealed that there was little to no traffic around Portland-specific terms. Instead, the high-traffic terms were all state-based.

It seems like a minor distinction, but without that small but valuable insight, the company would have been missing out on a potentially huge pot of revenue.

3. Being unprepared for Google updates

After Google updated its core algorithm in August 2018 (known as the “Medic” update), there were widespread reports of devastating traffic losses, particularly in the health and wellness sector. Some webmasters even claimed that the update had destroyed their businesses.

While it’s impossible to predict exactly how updates in Google’s algorithms will play out in the rankings, adhering to SEO best practices can mitigate the risk of being adversely impacted by an update.

For example, once the smoke had cleared from the initial damage of the Medic update, it was obvious that the sites that had been impacted the most were lacking in “E-A-T,” i.e., they were lacking expertise, authoritativeness and trustworthiness. By focusing on creating quality content and building trust and authority with their audience, sites with a robust SEO strategy in place had minimized their risk of a traffic and sales drop.

4. Ill-advised site redesigns

It may be shocking in 2018, but many businesses still employ agencies or developers with little or no knowledge of SEO to redesign their sites.

I can’t tell you how many times I’ve been called in at the last minute on a redesign only to find that a company has gone ahead and built their site completely in JavaScript, or made some serious technical error that will almost certainly result in a Google demotion. Then, they expect me to sprinkle a few keywords into their content and get them rankings immediately!

Going ahead with a redesign without bringing on an experienced SEO consultant at the beginning of the process is simply asking Google to take away your rankings — and your online income streams!

5. Lack of credibility and trust

While there is obviously a direct relationship between organic traffic and sales, if you ignore SEO you will also be missing out on many other benefits. For example, it is well known that organic listings have far more credibility with consumers than paid listings. If you dominate the first page of Google across your industry, it’s great PR and will reinforce the perception that your company has authority and expertise.

If you ignore SEO and your site appears rarely or not at all in organic listings, it may make consumers more suspicious of your credentials and even affect the conversion rate of your PPC ads.


It’s certainly possible to run a business without an SEO strategy, and you may even achieve success. However, by ignoring SEO, you’re holding yourself back from even greater sales and success by leaving opportunities sitting on the table for your competitors to snatch up.

A solid SEO strategy allows you to minimize the risks of a drop in sales, while also maximizing your online footprint and leveraging the most lucrative opportunities to your advantage.

Believe me, your bottom line will thank you!


Flying close to the sun: SEO tactics that may get you burned
Wed, 22 Aug 2018 17:23:00 +0000
The temptation to take the “quick and easy” route is everywhere, and SEO is no different. Contributor Stephan Spencer shows how going black- or gray-hat might sound good initially but, in the end, like Icarus, may get you burned.

The post Flying close to the sun: SEO tactics that may get you burned appeared first on Search Engine Land.


What kind of risk are you willing to take for better rankings and more organic traffic?

For many years now, there has been an ongoing debate in the search engine optimization (SEO) world about whether “black-hat” or “gray-hat” tactics — that is, techniques that attempt to achieve quicker results by flouting the search engines’ guidelines — are acceptable.

While many commentators take a moralistic tone around this issue, I prefer to look at it in terms of risk. If you are willing to risk a Google penalty for the possible payoff of quicker or better rankings, then go for it! Just don’t be surprised when Google gets wise to what you’re doing and your traffic takes a nosedive. It doesn’t matter if that happens months or years later; expect to pay the piper.

Steering clear

Personally, as someone who works with a lot of large corporations with much at stake, I steer well clear of black-hat and gray-hat techniques.

For anyone working on a domain they don’t want to go down in flames, there’s simply no way to justify gambling with a site’s authority and reputation in such a reckless manner. In the SEO world, there are plenty of people willing to take the risk. Many SEOs I know make the point that what is considered gray-hat and black-hat may be subjective, depending on the industry you are operating in.

While many SEO practitioners have years of experience in this field, Google’s algorithms get smarter by the day, and it’s becoming harder and harder for even the best SEOs to outsmart Google. That’s especially true for gray- or black-hat newbies. It’s fair to assume that many of Google’s algorithms learn and evolve autonomously. In such a world, gray-hat SEO techniques have become far riskier, not worth the effort.

So what are the risks and benefits (if any) of employing gray-hat and black-hat techniques in this day and age? For the record, Bing and Google are very, very clear on what goes against their policies. Here is a rundown of the most common gray/black-hat tactics and my insights on each.

Private blog networks

Private blog networks (PBNs) arose as a shortcut to building authority. The premise is simple: buy a bunch of expired domains with good domain authority and create a link back to your site.

Bingo! Instant backlinks and rankings, right?

Not really. These days, building a PBN takes a lot of effort and some sneaky tricks to avoid detection. For example, you’re going to want to look into the domain’s history and ensure it’s squeaky-clean, make sure it has never had a penalty or multiple owners. Sites that have been bought and sold many times are a red flag to Google. You’ll also want to make sure your sites are hosted by different companies and have different internet protocol (IP) addresses.

Pros: Using PBNs means you have full control over your link building and can save time and money on link outreach.

Cons: If just one of the sites in your network gets hit with a penalty, it can quickly be passed on to any site you’re linking to. You can torch your entire network with one slip-up.

My Take: I wouldn’t touch PBNs with a 10-foot pole. Period.

Spun, scraped or keyword-stuffed content

Creating good content is time-consuming and expensive, so it’s no surprise that people look for shortcuts.

While most duplicate content issues probably result more from technical misconfigurations than intentional deception, there are still plenty of people out there who think they can game the system by scraping other people’s content or “spinning articles” to create dozens of variations on the same article, oftentimes injecting extra keywords for good measure (i.e., keyword stuffing).

For those of us who actually prefer informative, readable content, it’s a blessing that these spammy content tricks don’t work anymore.

Google’s algorithms now have a much more sophisticated grasp on grammar and natural language. As such, Google can spot these tricks a mile away, so expect to get slapped with a penalty if you try them. My advice: Just write something people will want to read.

Pros: Quickly and easily create new content. At scale!

Cons: Destroy credibility with your users and search engines as soon as you publish.

My Take: Do you seriously have to ask?

Negative SEO

It’s a cutthroat world out there, and sometimes people sabotage their competitors’ websites.

This is referred to as “negative SEO” and generally involves pointing spammy links at a site, buying links on behalf of the competitor, scraping their content and duplicating it across multiple sites, trying to crash the site by driving too much bot traffic to it (i.e., DDOS attack), or even hacking into a site to insert malware or modify the content.

Not only are some of these tactics, like hacking, illegal, but Google is also getting better at detecting and ignoring things like spam links. In short, negative SEO is a huge risk, and there’s no guarantee it will even work.

One particularly nefarious negative SEO tactic is to pose as the competitor and launch a link removal campaign. That’s right, some link removal requests you receive as a webmaster are from forged senders.

Pros: None that are worth having.

Cons: You could wind up banned from the search results or in jail.

My Take: JUST DON’T. You will sleep better at night.

Paid links

Soliciting links from other websites to improve your ranking or boost traffic to your web pages is a vital part of SEO. It’s also time-consuming, frustrating and boring. But if you think buying links is the answer, think again. It’s against Google and Bing’s guidelines, so if either engine catches you, you can wind up with a penalty and have your rankings wiped out.

In addition, buying links is quite expensive. One Ahrefs study found that the average cost of buying a link is between $350 and $600. If you spent that money on a legit content marketing campaign, there’s no reason why you couldn’t achieve multiple links of higher quality for the same price.

Pros: Easier than traditional link building.

Cons: You’ll achieve a better outcome using white-hat link-building techniques, and you’ll probably spend less in the long run.

My Take: Buying links leaves an obvious footprint, and you’ll regret it when you have to launch a link-removal campaign to undo everything you built and then implement a slew of reconsideration requests.


Cloaking

Cloaking refers to the practice of creating one type of content to display to a search engine spider (say, a page full of keyword-rich copy) while showing another type of content to the user (for example, an image-heavy page with sales copy).

As you might have guessed, sites that cloak pages are generally trying to hide something from search engine spiders while attempting to manipulate their Google rankings.

Cloaking is a high-risk technique that will get you penalized or even blacklisted by most search engines. On top of that, it’s easy for Google to catch you, since its crawlers can visit your site from unpredictable IPs and with unpredictable user agents. Search engines analyze a number of elements on your site in order to determine what you should rank for, not just the content.

If you’re cloaking your content, it won’t take long for Google to figure it out.

Pros: None.

Cons: Your site will be penalized or even blacklisted by Google.

My Take: Danger, Will Robinson!


Using gray- and black-hat techniques requires a lot of sneaking around to obscure your online footprint and avoid being penalized. This seems fairly basic, but you’d be surprised at the people who disagree.

Some SEOs feel it’s up to you to decide if the additional effort and risk is worth it. If yes, they feel your goal should be to make your site’s “story” believable to the search engines and use gray-hat techniques to kick-start your SEO, then transition to more conventional white-hat techniques to gradually eliminate the risk over time.

My position on all this? Sticking to white-hat techniques from the beginning is actually less effort and involves less risk, with higher returns in the long run. As a bonus, you won’t be losing sleep at night worrying about the day that Google will finally come knocking on your door.

Fly well, my friends!

For webmasters affected by a manual action, it’s important to understand why a particular penalty is applied, the consequences and how to adequately address key issues. Check out our Google Manual Actions: Frequently asked questions and their answers for help and insights.


The ultimate guide to bot herding and spider wrangling — Part 3
Thu, 28 Jun 2018 14:00:00 +0000
In this third and final installment, contributor Stephan Spencer outlines common coding, mobile and localization issues and offers workarounds to make sure your code provides consistent cues.

The post The ultimate guide to bot herding and spider wrangling — Part 3 appeared first on Search Engine Land.

In parts one and two of this series, we learned what bots are and why crawl budgets are important. In the third and final segment, we’ll review common coding, mobile and localization issues bots may encounter on their journey to let the search engines know what’s important on your site.

Common coding issues

Good, clean code is important if you want organic rankings. Unfortunately, small mistakes can confuse crawlers and lead to serious handicaps in search results.

Here are a few basic ones to look out for:

1. Infinite spaces (also known as spider traps). Poor coding can sometimes unintentionally result in “infinite spaces” or “spider traps.”

Some issues can cause the spider to get stuck in a loop that can quickly exhaust your crawl budget. These include endless uniform resource locators (URLs) pointing to the same content; pages with the same information presented in a number of ways (e.g., dozens of ways to sort a list of products); or calendars that contain an infinity of different dates.

Mistakenly serving a 200 status code in the hypertext transfer protocol (HTTP) response header of pages that should return a 404 error is another way to present bots with a website that has no finite boundaries. Relying on Googlebot to correctly determine all the “soft 404s” is a dangerous game to play with your crawl budget.

When a bot hits large amounts of thin or duplicate content, it will eventually give up, which can mean it never gets to your best content, and you wind up with a stack of useless pages in the index.

Finding spider traps can sometimes be difficult, but using the aforementioned log analyzers or a third-party crawler like Deep Crawl is a good place to start.

What you’re looking for are bot visits that shouldn’t be happening, URLs that shouldn’t exist or substrings that don’t make any sense. Another clue may be URLs with infinitely repeating elements, like:…
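Spotting such trap URLs in a log export can be partly automated. Below is a minimal, self-contained sketch (the URLs and the `find_suspect_urls` helper are hypothetical, for illustration only) that flags URLs whose path repeats the same segment several times in a row, a common symptom of calendar loops and recursive paths:

```python
from urllib.parse import urlparse

def find_suspect_urls(log_urls, min_repeats=3):
    """Flag URLs whose path contains the same segment repeated
    min_repeats (or more) times consecutively."""
    suspects = []
    for url in log_urls:
        # Split the path into non-empty segments
        segments = [s for s in urlparse(url).path.split("/") if s]
        run = 1
        for prev, cur in zip(segments, segments[1:]):
            run = run + 1 if cur == prev else 1
            if run >= min_repeats:
                suspects.append(url)
                break
    return suspects

urls = [
    "https://example.com/shop/category/widgets",
    "https://example.com/cal/2018/2018/2018/2018/",
]
print(find_suspect_urls(urls))
# → ['https://example.com/cal/2018/2018/2018/2018/']
```

A real audit would also compare crawled URL counts against your sitemap and examine query-string permutations, but even a crude check like this can surface obvious loops quickly.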

2. Embedded content. If you want your site crawled effectively, it’s best to keep things simple. Bots often have trouble with JavaScript, frames, Flash and asynchronous JavaScript and XML (AJAX).

Even though Google is getting better at crawling formats like JavaScript and AJAX, it’s safest to stick to old-fashioned hypertext markup language (HTML) where you can.

One common example of this is sites that use infinite scroll. While it might improve your usability, it can make it difficult for search engines to properly crawl and index your content. Ensure that each of your article or product pages has a unique URL and is connected via a traditional linking structure, even if it is presented in a scrolling format.

Mobile sites

Google’s announcement of mobile-first indexing in November 2016 sent shockwaves through the search engine optimization (SEO) community. It’s not really surprising when you think about it, since the majority of searches are conducted from mobile devices, and mobile is the future of computing. Google is squarely focused on the mobile versions of pages rather than the desktop versions when it comes to analysis and ranking. This means that bots are looking at your mobile pages before they look at your desktop pages.

1. Optimize for mobile users first. Gone are the days when a mobile site could be a simplified version of your desktop site. Instead, start by considering the mobile user (and search engine bots) first, and work backward.

2. Mobile/desktop consistency. Although most mobile sites are now responsive, if you have a separate mobile version of your site, ensure that it has the same internal linking structure, and link bi-directionally between the two sites using rel=alternate and rel=canonical link elements.

Point to the desktop version from the mobile site using rel=canonical and point to the mobile site from the desktop site with rel=alternate. Note that this is an interim solution until you move to responsive design, which is the preferred approach, according to Google.
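In markup, the bi-directional annotations for a separate mobile site look like the following sketch (example.com and m.example.com are placeholder hostnames):

```html
<!-- On the desktop page: https://example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page: https://m.example.com/page -->
<link rel="canonical" href="https://example.com/page">
```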


3. Accelerated mobile pages. Accelerated mobile pages (AMP) are one of Google’s more controversial inventions, and many webmasters are still hesitant to use them, since it means letting Google host a cached version of your pages on their own domain.

Google’s rationale is that accelerated mobile pages allow them to serve content up more quickly to users, which is vitally important with mobile. While it’s not clear whether Google actually prioritizes accelerated mobile pages over other types of mobile pages in search results, the faster load time could contribute to a higher ranking.

Point to the AMP version of a page using rel=amphtml and point back to the canonical URL from the AMP page using rel=canonical. Note that even though accelerated mobile pages are hosted on a Google URL, they still use up your crawl budget.
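The reciprocal AMP annotations look like this (again with example.com as a placeholder):

```html
<!-- On the canonical page -->
<link rel="amphtml" href="https://example.com/article.amp.html">

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/article.html">
```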

Should you block bad bots?

Unfortunately, it’s not only search engines that use bots. They come in all shapes and sizes… and intentions, including those designed to hack, spy, spam and generally do nasty stuff to your website.

Unlike friendly search engine bots, these spiders are more likely to ignore all your instructions and go straight for the jugular. There are still some hacks you can use to keep bad bots out. Be warned, these hacks can be time-consuming, so it might be worth consulting your hosting company on their security solutions if you’re really struggling.

1. Using htaccess to block internet protocol (IP) addresses. Blocking bad bots can be as simple as adding a “deny” rule to your htaccess file for each bot you want to block. The tricky part here, of course, is actually figuring out what IP address the bot is using.

Some bots might even use several different IPs, meaning you need to block a range of addresses. You also want to make sure you don’t block legitimate IP addresses. Unless you have a list of known IPs to block from a trusted source, or you know which page the bot accessed along with the approximate time or the geographical location of the server, you’re likely to spend hours searching through your log files.

2. Using htaccess to block user agent strings. Another option is to set up a “deny” rule for a specific user agent string. Again, you’ll need a list from a trusted source, or you’ll be sorting through your log files to identify a particular bot, and then add the information to your htaccess file.
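Combining both approaches, an Apache 2.4 .htaccess fragment might look like this (the IP addresses and user-agent names are placeholders, not known bad bots):

```apache
# Tag requests whose user-agent matches a known-bad pattern
BrowserMatchNoCase "BadBot" bad_bot
BrowserMatchNoCase "EvilScraper" bad_bot

# Allow everyone except the flagged agents and the listed IPs
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
    Require not ip 198.51.100.0/24
    Require not env bad_bot
</RequireAll>
```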


Localization

Since bots need to understand what country/regional version of a search engine you want your pages to appear in, you need to make sure your code and content provide consistent cues about where your sites should be indexed.

1. Hreflang. The hreflang tag (which is actually a type of rel=alternate link element) tells bots what language and region your page is targeting (e.g., en-ca or en-au).

This sounds simple enough, but it can cause a number of headaches. If you have two versions of the same page in different languages, you will need to provide one hreflang tag for each. Those two hreflang tags will need to be included in both pages. If you mess this up, your language targeting could be considered invalid, and your pages might trip the duplicate content filter or not be indexed in the right country version of Google.
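For instance, an English page with Canadian and Australian variants would carry the full set of annotations, identically, on every variant (example.com is a placeholder domain):

```html
<!-- Included in the <head> of BOTH language/region versions -->
<link rel="alternate" hreflang="en-ca" href="https://example.com/en-ca/page">
<link rel="alternate" hreflang="en-au" href="https://example.com/en-au/page">
```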

2. Local spellings. While hreflang tags are important, bots are also looking for other clues that guide them on how they should index your site. One thing to be careful of is local spellings. If your page is targeted at a US audience, yet you use UK spellings, it could result in being listed in the wrong country version of Google.

3. Top-level domains, subdomains or subdirectories for different locations. If you want to make it even clearer to bots that your content is targeted to a specific region, you can use country code top-level domains (ccTLDs), subdomains or subdirectories. For example, the following are various ways to indicate content targeted at Canadian users:

  • A ccTLD: example.ca
  • A subdomain: ca.example.com
  • A subdirectory: example.com/ca/
While many website owners and even some SEOs may think they can wing it with good content and quality backlinks alone, I want to emphasize that many of these small tweaks can have a significant impact on your rankings.

If your site’s not crawled — or crawled badly — your rankings, traffic and sales will ultimately suffer.

The post The ultimate guide to bot herding and spider wrangling — Part 3 appeared first on Search Engine Land.

The ultimate guide to bot herding and spider wrangling — Part Two /the-ultimate-guide-to-bot-herding-and-spider-wrangling-part-two-296290 Wed, 02 May 2018 15:17:00 +0000 /?p=296290 Next up in a series on bots and why crawl budgets are important, Columnist Stephan Spencer explains how to direct the engine bots to what's important on your site and how to avoid common coding issues.

The post The ultimate guide to bot herding and spider wrangling — Part Two appeared first on Search Engine Land.

In Part One of our three-part series, we learned what bots are and why crawl budgets are important. Let’s take a look at how to let the search engines know what’s important and some common coding issues.

How to let search engines know what’s important

When a bot crawls your site, there are a number of cues that direct it through your files.

Like humans, bots follow links to get a sense of the information on your site. But they’re also looking through your code and directories for specific files, tags and elements. Let’s take a look at a number of these elements.


Robots.txt

The first thing a bot will look for on your site is your robots.txt file.

For complex sites, a robots.txt file is essential. For smaller sites with just a handful of pages, a robots.txt file may not be necessary — without it, search engine bots will simply crawl everything on your site.

There are two main ways you can guide bots using your robots.txt file.

1. First, you can use the “disallow” directive. This will instruct bots to ignore specific uniform resource locators (URLs), files, file extensions, or even whole sections of your site:

User-agent: Googlebot
Disallow: /example/

Although the disallow directive will stop bots from crawling particular parts of your site (therefore saving on crawl budget), it will not necessarily stop pages from being indexed and showing up in search results.

The cryptic and unhelpful “no information is available for this page” message that appears in such listings is not something that you’ll want to see in your search results.

That kind of listing comes about because of a disallow directive in the site’s robots.txt file like this one:

User-agent: Googlebot
Crawl-delay: 3

Disallow: /cgi-bin/

2. Another way is to use the noindex directive. Noindexing a certain page or file will not stop it from being crawled; however, it will stop it from being indexed (or remove it from the index). This robots.txt directive is unofficially supported by Google, and is not supported at all by Bing (so be sure to have a User-agent: * set of disallows for Bingbot and other bots besides Googlebot):

User-agent: Googlebot
Noindex: /example/
User-agent: *
Disallow: /example/

Obviously, since these pages are still being crawled, they will still use up your crawl budget.

This is a gotcha that is often missed: the disallow directive will actually undo the work of a meta robots noindex tag. This is because the disallow prevents the bots from accessing the page’s content, and thus from seeing and obeying the meta tags.
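To illustrate with a hypothetical /example/ path: if robots.txt contains Disallow: /example/, the tag below is never fetched, so it can never be obeyed:

```html
<!-- On /example/page.html: bots blocked by the disallow never see this,
     so the URL can still end up indexed via external links -->
<meta name="robots" content="noindex">
```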

Another caveat with using a robots.txt file to herd bots is that not all bots are well-behaved, and some will even ignore your directives (especially malicious bots looking for vulnerabilities). For a more detailed overview of this, check out A Deeper Look at Robots.txt.

XML sitemaps

XML sitemaps help bots understand the underlying structure of your site. It’s important to note that bots use your sitemap as a clue, not a definitive guide, on how to index your site. Bots also consider other factors (such as your internal linking structure) to figure out what your site is about.

The most important thing with your eXtensible markup language (XML) sitemap is to make sure the message you’re sending to search engines is consistent with your robots.txt file.

Don’t send bots to a page you’ve blocked them from; consider your crawl budget, especially if you decide to use an automatically generated sitemap. You don’t want to accidentally give the crawlers thousands of pages of thin content to sort through. If you do, they might never reach your most important pages.

The second most important thing is to ensure your XML sitemaps only include canonical URLs, because Google looks at your XML sitemaps as a canonicalization signal.
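A minimal sketch of such a sitemap (the example.com URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only canonical URLs that bots are allowed to crawl -->
  <url>
    <loc>https://example.com/snowboards/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
</urlset>
```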


Canonical tags

If you have duplicate content on your site (which you shouldn’t), then the rel=“canonical” link element tells bots which URL should be considered the master version.
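For instance (example.com is a placeholder), every duplicate version of the home page would carry the same element pointing at the master URL:

```html
<!-- In the <head> of example.com/index.html, example.com/?sessionid=123, etc. -->
<link rel="canonical" href="https://example.com/" />
```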

One key place to look out for this is your home page. Many people don’t realize their site might house multiple copies of the same page at differing URLs. If a search engine tries to index these pages, there is a risk that they will trip the duplicate content filter, or at the very least dilute your link equity. Note that adding the canonical link element will not stop bots from crawling the duplicate pages. Here’s an example of such a home page indexed numerous times by Google:


Pagination

Setting up rel=”next” and rel=”prev” link elements correctly is tricky, and many people struggle to get it right. If you’re running an e-commerce site with a great many products per category, rel=next and rel=prev are essential if you want to avoid getting caught up in Google’s duplicate content filter.

Imagine that you have a site selling snowboards. Say that you have 50 different models available. On the main category page, users can view the first 10 products, with a product name and a thumbnail for each. They can then click to page two to see the next 10 results and so on.

Each of these pages would have the same or very similar titles, meta descriptions and page content, so the main category page should have a rel=”next” (no rel=”prev” since it’s the first page) in the head portion of the hypertext markup language (HTML).  Adding the rel=”next” and rel=”prev” link element to each subsequent page tells the crawler that you want to use these pages as a sequence.
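Sketching the snowboard example with hypothetical example.com URLs, the head sections would pair up like this:

```html
<!-- Page 1: first in the sequence, so rel="next" only -->
<link rel="next" href="https://example.com/snowboards/page/2/" />

<!-- Page 2: points both ways -->
<link rel="prev" href="https://example.com/snowboards/" />
<link rel="next" href="https://example.com/snowboards/page/3/" />
```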

Alternatively, if you have a “view all” page, you could canonicalize to that “view all” page on all the pagination pages and skip the rel=prev/next altogether. The downside of that is that the “view all” page is what is probably going to be showing up in the search results. If the page takes too long to load, your bounce rate with search visitors will be high, and that’s not a good thing.

Without rel=”canonical,” rel=”next” and rel=”prev” link elements, these pages will be competing with each other for rankings, and you risk a duplicate content filter. Correctly implemented, rel=prev/next will instruct Google to treat the sequence as one page, or rel=canonical will assign all value to the “view all” page.

Common coding issues

Good, clean code is important if you want organic rankings. Unfortunately, small mistakes can confuse crawlers and lead to serious handicaps in search results.

Here are a few basic ones to look out for:

1. Infinite spaces (aka spider traps). Poor coding can sometimes unintentionally result in “infinite spaces” or “spider traps.” Issues like endless URLs pointing to the same content, or pages with the same information presented in a number of ways (e.g., dozens of ways to sort a list of products), or calendars that contain an infinity of different dates, can cause the spider to get stuck in a loop that can quickly exhaust your crawl budget.

Mistakenly serving up a 200 status code in the hypertext transfer protocol (HTTP) header of your 404 error pages is another way to present bots with a website that has no finite boundaries. Relying on Googlebot to correctly determine all the “soft 404s” is a dangerous game to play with your crawl budget.

When a bot hits large amounts of thin or duplicate content, it will eventually give up, which can mean it never gets to your best content, and you wind up with a stack of useless pages in the index.

Finding spider traps can sometimes be difficult, but using the aforementioned log analyzers or a third-party crawler like Deep Crawl is a good place to start.

What you’re looking for are bot visits that shouldn’t be happening, URLs that shouldn’t exist or substrings that don’t make any sense. Another clue may be URLs with infinitely repeating elements, like:…
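As a rough illustration (my own sketch, not a tool from the article), a simple heuristic can flag URLs whose paths repeat the same segment back to back, a common symptom of a relative-link loop:

```python
def looks_like_spider_trap(url: str, min_repeats: int = 3) -> bool:
    """Heuristic: flag URLs whose path repeats a segment consecutively,
    e.g. /cat/cat/cat/... -- typical of a broken relative link."""
    path = url.split("?", 1)[0]                   # ignore the query string
    segments = [s for s in path.split("/") if s]  # drop empty segments
    run = 1
    for prev, cur in zip(segments, segments[1:]):
        run = run + 1 if cur == prev else 1
        if run >= min_repeats:
            return True
    return False

# Flag suspicious URLs in a list pulled from your server logs
urls = [
    "https://example.com/snowboards/page/2/",
    "https://example.com/cat/cat/cat/cat/page.html",
]
suspects = [u for u in urls if looks_like_spider_trap(u)]
```

This is only a first-pass filter; a log analyzer or crawler will give a fuller picture.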

2. Embedded content. If you want your site crawled effectively, it’s best to keep things simple. Bots often have trouble with JavaScript, frames, Flash and asynchronous JavaScript and XML (AJAX). Even though Google is getting better at crawling formats like JavaScript and AJAX, it’s safest to stick to old-fashioned HTML where you can.

One common example of this is sites that use infinite scroll. While it might improve your usability, it can make it difficult for search engines to properly crawl and index your content. Ensure that each of your article or product pages has a unique URL and is connected via a traditional linking structure, even if it is presented in a scrolling format.
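One hedged illustration: keep plain pagination links in the rendered HTML even when JavaScript loads more items on scroll (the URL is a placeholder):

```html
<nav>
  <!-- Crawlable fallback for an infinite-scroll listing -->
  <a href="/blog/page/2/">Older posts</a>
</nav>
```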

In the next and final installment of this series, we’ll look at how bots are looking at your mobile pages, discuss if you should block bad bots, and dive into localization and hreflang tags. Stay tuned!


The ultimate guide to bot herding and spider wrangling /ultimate-guide-bot-herding-spider-wrangling-293284 Wed, 07 Mar 2018 15:24:40 +0000 /?p=293284 In Part 1 of a three-part series, Columnist Stephan Spencer does a deep dive into bots, explaining what they are and why crawl budgets are important.

The post The ultimate guide to bot herding and spider wrangling appeared first on Search Engine Land.


This is Part 1 of a three-part series.

We generally think about search engine optimization in relation to humans: What queries are my customers using?

How can I get more bloggers to link to me?

How can I get people to stay longer on my site?

How can I add more value to my customers’ lives and businesses?

This is how it should be.

But even though we live in a world that is increasingly affected by non-human actors like machines, artificial intelligence (AI) and algorithms, we often forget a large part of optimizing a website has nothing to do with people at all.

In fact, many of the website visitors we need to please are actually robots, and we ignore them at our peril!

What is a bot, anyway?

A bot (also known as a spider or crawler) is simply a piece of software that Google (or another company) uses to scour the web and gather information or perform automated tasks.

The term “bot” or “spider” is slightly misleading, as it suggests some level of intelligence. In reality, these crawlers aren’t really doing much analysis. The bots aren’t ascertaining the quality of your content; that’s not their job. They simply follow links around the web while siphoning up content and code, which they deliver to other algorithms for indexing.

These algorithms then take the information the crawler has gathered and store it in a massive, distributed database called the index. When you type a keyword into a search engine, it is this database you are searching.

Other algorithms apply various rules to evaluate the content in the database and decide where a uniform resource locator (URL) should be placed in the rankings for a particular search term. The analysis includes such things as where highly related keywords appear on a page, the quantity and quality of the backlinks and the overall content quality.

By now, you’re probably getting the gist of why optimizing for bots is important.

While the crawler doesn’t decide whether your site will appear in search results, if it can’t gather all the information it needs, then your chances of ranking are pretty slim!

So, how do you wrangle all those crawlers and guide them to where they need to be? And how do you give them exactly what they’re looking for?

First things first: Understanding crawl budget

If you want to optimize your site for bots, you first need to understand how they operate. That’s where your “crawl budget” comes in.

Crawl budget is a term search engine optimization specialists (SEOs) developed to describe the resources a search engine allocates to crawl a given site. Essentially, the more important a search engine deems your site, the more resources it will assign to crawling it, and the higher your crawl budget.

While many commentators have tried to come up with a precise way to calculate crawl budget, there is really no way to put a concrete number on it.

After the term became popular, Google weighed in with an explanation of what crawl budget means for Googlebot. They emphasize two major factors that make up your crawl budget:

  • Crawl rate limit: The rate at which Googlebot can crawl a site without degrading your users’ experience (as determined by your server capacity and so on).
  • Crawl demand: Based on the popularity of a particular URL, as well as how “stale” the content at that URL is in Google’s index. The more popular a URL, the higher the demand, and the more it’s updated, the more often Google needs to crawl it.

In other words, your crawl budget will be affected by a number of factors, including how much traffic you get, the ease with which a search engine can crawl your site, your page speed, page size (bandwidth use), how often you update your site, the ratio of meaningful to meaningless URLs and so on.

To get an idea of how often Googlebot crawls your site, simply head over to the “Crawl: Crawl Stats” section of Google Search Console. These charts are provided for free by Google, and they are indeed helpful, but they paint a woefully incomplete picture of bot activity on your site.

Ideally, you should analyze your server log files with a program like OnCrawl or Screaming Frog Log Analyser.

It’s important to bear in mind that Google Search Console (GSC) is not a server log analyzer. In other words, there is no capability for webmasters to upload server logs to GSC for analysis of all bot visits, including Bingbot.
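To show what basic log analysis looks like, here is a small sketch of my own (the sample lines assume the common Apache combined log format and are fabricated for illustration):

```python
import re
from collections import Counter

SAMPLE_LOG = """\
66.249.66.1 - - [07/Mar/2018:10:12:01 +0000] "GET /snowboards/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [07/Mar/2018:10:12:05 +0000] "GET /old-page/ HTTP/1.1" 404 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [07/Mar/2018:10:13:44 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
"""

def bot_status_counts(log_text: str, bot_name: str = "Googlebot") -> Counter:
    """Tally HTTP status codes for requests whose User-Agent names the bot."""
    counts = Counter()
    for line in log_text.splitlines():
        if bot_name in line:
            match = re.search(r'" (\d{3}) ', line)  # status after the request
            if match:
                counts[match.group(1)] += 1
    return counts
```

Pointing this at real, rotated log files (and verifying Googlebot IPs via reverse DNS, since user agents can be spoofed) is left to the dedicated tools mentioned above.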

There are a few major things to consider when optimizing your crawl budget:

  • The frequency of site updates. If you run a blog that’s updated once a month, don’t expect Google to place a high priority on crawling your site. On the other hand, high-profile URLs with a high frequency of updates (like HuffPost’s home page, for example) might be crawled every few minutes. If you want Googlebot to crawl your site more often, feed it content more frequently.
  • Host load. While Google wants to crawl your site regularly, it also doesn’t want to disrupt your users’ browsing experience. A high frequency of crawls can place a heavy load on your servers. Generally, sites with limited capacity (such as those on shared hosting) or unusually large page weights are crawled less often.
  • Page speed. Slow load time can affect your rankings and drive away users. It also deters crawlers that need to gather information quickly. Slow page load times can cause bots to hit their crawl rate limit quickly and move on to other sites.
  • Crawl errors. Problems like server timeouts, 500 server errors and other server availability issues can slow bots down or even prevent them from crawling your site altogether. In order to check for errors, you should use a combination of tools, such as Google Search Console, Deep Crawl or Screaming Frog SEO Spider (not to be confused with Screaming Frog Log Analyser). Cross-reference reports, and don’t rely on one tool exclusively, as you may miss important errors.

This ends Part 1 of our three-part series: The Ultimate Guide to Bot Herding and Spider Wrangling.  In Part 2, we’ll learn how to let search engines know what’s important on our webpages and look at common coding issues. Stay tuned.


YouTube SEO 101: Get started optimizing video /youtube-seo-101-289416 Wed, 10 Jan 2018 19:38:56 +0000 /?p=289416 In this comprehensive guide to YouTube SEO, columnist Stephan Spencer explains the fundamentals of YouTube optimization and explains how to increase visibility and rankings for your videos.

The post YouTube SEO 101: Get started optimizing video appeared first on Search Engine Land.


Based on Alexa traffic rankings, YouTube is the second most visited site on the web, right after Google. Unfortunately, a lot of digital marketers still treat it like any other social media site. But success on YouTube isn’t just about posting content; it’s about optimizing your content — just like your website.

It’s easy to find videos with millions of views and videos with almost none that are basically the same. The difference between success and failure often boils down to a few elements.

When it comes to YouTube SEO, a lot of the optimization work can be encapsulated into a process that you can apply to all your old videos and then to each video as you publish it. And you’re about to learn that process.

Here’s what you need to know if you want your content to rank number one on YouTube for the keywords you care about.

The basics

This section contains the essential background information you’ll need to understand before you dive into YouTube optimization tactics.

Start with keyword research

Given that YouTube is a video search engine, you should approach content creation in a strategic way, as you would when optimizing your website. This means conducting keyword research to find out what your audience is interested in and how they talk about it online.

It’s easy to start your YouTube keyword brainstorming. Simply go to YouTube and start typing a keyword in the search box. As you type, you will get popular searches suggested to you by YouTube Suggest, which is the autocomplete feature built into the search box on YouTube. You can take this to another level using the free Ubersuggest tool, which will iterate through the alphabet for the first letter of the next word of your search phrase. Remember to select “YouTube” instead of the default “Web.”

Keyword brainstorming is one thing, but you probably need to be able to compare keywords to each other to see which ones are searched on more frequently. There’s a tool for that, and it’s completely free, provided to us by Google: Google Trends. It’s surprising how many SEO practitioners don’t realize Google Trends has a “YouTube search” option underneath the “Web search” option, which will give you YouTube-specific search volume data. This tool doesn’t give you actual numbers, unfortunately (everything is in percentages), but nonetheless, it is quite handy for comparing keywords to each other.

Track your YouTube search rankings

You probably track your positions in the Google search results for a range of your favorite keywords, but are you doing this with YouTube? If not, you should be! There are many tools for this, both free and paid, so find one that you feel comfortable with so that you can track your progress as you optimize your videos.

Content is king, but consistency is queen

Obviously, to compete with all the other creators in the fast-paced, aggressive world of YouTube, you need great content that stands out from the crowd. While achieving a viral hit is great, remember that YouTube isn’t just about views: You’re looking to build a subscriber base and form long-term relationships with viewers.

How can you accomplish this? By producing quality content and publishing it on a regular schedule. Posting irregularly will only hurt you and result in lost subscribers. If you commit to posting every day, make sure you post every day. If you post once a week at 9:00 a.m. on a Tuesday, never skip a week or post a video late (even by a few hours, let alone a day).

Short is not sweet

Beware of agencies and production houses that tell you people only watch short, one- to two-minute videos on YouTube. Remember, YouTube’s ultimate goal is to compete with television so they can charge TV-like advertising rates. What they’re looking for is high-quality, long-form content that will allow them to run more ads and keep users on the site for longer. Videos that are at least five minutes in length tend to perform better and have a higher chance of ranking in Google searches.

A key metric to keep an eye on is watch time — not just for each video, but for your channel overall. Ideally, you should be seeing monthly increases in watch time as your channel grows.

The power of playlists

Playlists are an underrated promotional tool on YouTube. While most businesses create playlists around dates, content genres, products and other broad categories, to really take advantage of this feature, you need to go deeper.

Use your keyword research to figure out what people are searching for in your niche, and create playlists based on those topics. If you don’t have much content, you can even create playlists using other people’s videos to drive viewers to your YouTube channel page.

First 48 hours are critical

YouTube’s algorithms are notoriously unforgiving. When you upload a new video, make sure you have all your optimizations ready to go (see below). Come out of the gate strong, or not at all. Don’t publish a video with the intention of optimizing it sometime later. If YouTube can’t get a clear picture of what your video is about, or if you aren’t getting any traction from viewers (in terms of watch time and other engagement metrics), you’ll suffer in the rankings — and it will be hard to recover that lost ground.

While it is possible to go back and fix poorly optimized videos by revising the titles, description, tags, thumbnail, transcript and so on (which I do encourage), much of the damage will have already been done after the first 48 hours have passed. It is incredibly hard to come back from being buried once the algorithm has judged your content as unworthy (please forgive the Thor reference).

How to optimize your videos

Now that you understand the basics, it’s time to get down to business. Here’s how you can optimize your videos for success on YouTube.

The title

The video title should be punchy and should grab the user. It shouldn’t be too wordy — instead, it should concisely convey why the user should bother watching your video. Hit them with the good stuff!

Before you decide on your title, do your keyword research (as described above), and then take a look at your competitors for those keywords. These are the videos you’ll be going up against, so you want your title to be as good as theirs, if not better.

Titles play a large part in the ranking of your video, so make sure they are at least five words long and include the keyword that you want to rank for.

The thumbnail

A video’s thumbnail image is actually more important than the title in terms of attracting the click from the YouTube searcher. You could do every other thing right for your SEO, but if you have an unappealing thumbnail, no one is going to click on your video.

Think about it: The thumbnail is the only image that gives people a sense of what they’re about to invest their time in watching. If it looks unprofessional or boring, people aren’t going to consider it a good use of time.

For the best results, go with a “custom thumbnail” (you will need to be verified by YouTube in order to do this) and have that thumbnail image include graphical text.


Do:

  • Customize your thumbnail image with titles/fun graphics.
  • Have professional shots taken with the thumbnail in mind. (Note: You don’t have to use a frame from the video as the basis for the thumbnail.)
  • Make it intriguing.
  • Ensure it is well-lit.

This buttercream frosting video thumbnail draws the eye with its well-balanced, bright colors that aren’t overwhelming, the very visual title and the nicely set up photo.


Don’t:

  • Have an intrusive logo.
  • Use clashing colors.
  • Have a random, unprofessional-looking still.
  • Make your thumbnail all text.

This cupcake decorating video thumbnail isn’t effective because it’s confusing. The woman is looking at some off-screen person, the moment looks unpolished, and we aren’t sure what’s going on.


The description

Many people make the mistake of only writing a few sentences for the description. This is your chance to expand on the information in the video with links, calls to action and performer bios. If you want people to click on a link to your website, include it “above the fold,” before the “Show more” prompt. Also, include some sort of enticing hook in that first sentence that will get people to click “Show More” to see the rest of your video’s description.

Take a look at this description of an HGTV video above and below the fold when one hits “Show More” or “Show Less.” You’ll want a long description so users can get more insight into the video; don’t be afraid to include lots of information. This also gives you another shot at including relevant keywords.


The transcript

The video transcript (i.e., captions) serves as additional copy that is considered in YouTube’s rankings algorithm. Don’t rely on YouTube’s automated transcription process — there are going to be errors in that transcript, guaranteed. Either proofread and edit that automated transcript or use a transcription service or a VA (Virtual Assistant) to create a transcript of the video. If you do the latter, remember that it needs to be time-stamped to match the audio track.


Translations

Did you know that you can provide foreign language translations of your video in the same time-stamped format of your transcript? It’s a great way to globalize your content without having to reshoot your videos. It allows foreign language viewers to watch your video with subtitles (closed captioning), and it allows your video to rank for keywords in that foreign language. For example, you could translate your video into Spanish and upload the translated transcription.

YouTube gives you the ability to include various metadata in multiple languages, such as the title, tags and descriptions, in addition to the closed captioning.


Tags

Tagging isn’t rocket science. Make sure you use phrases as well as single keywords; for example, if your video is about surfing at Malibu Beach, tag it with “surfing,” “Malibu Beach” and “surfing at Malibu Beach.” Tags aren’t visible on YouTube by default, but you can view the tags on YouTube videos using the free vidIQ Chrome extension. Have fun mining your competitor’s content for the best tags!


Links

Make sure you are linking in the description to everywhere that you want your potential fan base to go: all of your social channels, your site, other videos of yours (to boost the overall viewership and get more subscribers) and wherever else you might want to send viewers, like to a squeeze page. Choose your most important link to display above the fold in the description. You can also promote some of these destinations with YouTube cards, which is a perfect segue to my next point.

Call to action

The end of your video should make subscribing practically automatic for the user. Give them a one-click option to subscribe, and then tell them why they should. Life coach and motivational speaker Marie Forleo is fantastic at this. At the end of her videos, she gives a neat little outro like “If you like this video and found its tips helpful, subscribe!” She even has a little arrow pointing to the subscribe button just in case viewers don’t get the hint. You need to be that obvious.

Hovering over or clicking on the picture of Forleo’s face gives users an easy way to subscribe to her YouTube channel.

Subscriptions send a big signal to Google: If people subscribe because of this video, there must be something worthwhile about it. Forleo also has videos playing in boxes at the very end of the video — in what is called the “end screen” — that send you to other videos of hers. This is a great way to get views to accumulate for your other videos, and it gives you that extra chance of landing a subscriber.


Once you’ve optimized and uploaded your videos, you’ll want to be able to monitor and analyze their performance.

YouTube Analytics

YouTube Analytics is great for learning more about who is watching your videos. Some examples of the data you can find are traffic sources, demographics and what percentage of your watchers are subscribers. This lets you know where to focus your energies and resources. Are a large number of your viewers subscribers who follow you closely? Perhaps create some content that caters specifically to them.

You’ll also want to combine YouTube Analytics with Google Analytics, which gives you access to more features. To see activity on your channel page in Google Analytics, simply add your Google Analytics tracking code to your channel.

Subscriber conversion is key

There are plenty of metrics to keep an eye on in YouTube, but one key metric to watch is your subscriber conversion. If your goal is to build your audience, then you’ll want to know which videos are so compelling that they convince a viewer to hit “subscribe.” Thankfully, YouTube Analytics will now show you exactly which video a subscriber came from. Use this insight to give your audience more of what they want.

Third-party analytics tools

You may find yourself in need of more data than YouTube Analytics and Google Analytics can provide. There are a variety of tools out there, both free and paid, which can provide deeper insights into YouTube performance metrics, such as rankings, view count, comments, likes, dislikes, video replies and favorites. This kind of data can help you better optimize your video content, as well as inform content creation and distribution strategies. (For instance, perhaps you find that the highest view rates are happening on the weekends, so you decide to post the next video on the weekends to get more viewers.)

Your marching orders

First, spruce up all the existing YouTube videos on your channel. Even if they’ve been up for years, put in the time to clean up their appearance, make use of a few of YouTube’s tools (as well as a few third-party ones) and provide a better viewer experience. You can still see improvements in your channel’s performance.

Then, develop a new workflow for new videos that you’re going to publish, including all these tools and tips.

If you’re serious about getting more YouTube views, subscribers and rankings, it’s essential to invest time in video optimization. The best part, undoubtedly, is the low barrier to entry to being a YouTube SEO practitioner. Just start ticking all of YouTube’s boxes, and you’re well on your way!


Understanding the interplay of SEO and a 5-star reputation /understanding-interplay-seo-5-star-reputation-286711 Wed, 15 Nov 2017 15:36:45 +0000 /?p=286711 How do online reviews impact search visibility, and what can you do to improve your online reputation? Columnist Stephan Spencer addresses these questions and more.

The post Understanding the interplay of SEO and a 5-star reputation appeared first on Search Engine Land.


Is your online reputation fully optimized? Online reviews are a fundamental part of local search. That’s because 97 percent of consumers read online reviews for businesses, and 85 percent report that they trust online reviews as much as personal recommendations, BrightLocal’s 2017 Local Consumer Review Survey found.

It’s not a matter of if your business will get reviews, but when. Poor reviews can sink even the strongest businesses. Here’s a guide to understanding the interplay between reviews, local search and earning (or keeping) a five-star reputation.

How reviews influence local search

Online reviews are not only influencing consumers; they’re also influencing search engine results. According to this year’s Moz Local Ranking Factors Survey, local search experts estimate that review signals (quantity, velocity, diversity and so on) determine about 13 percent of the local pack rankings and 7 percent of the local organic rankings in Google.

The three pillars of local search are relevance, proximity and authority. How can reviews influence these pillars? By adding content and context.

Unlike local business websites, reviews are made up entirely of user-generated content. The content provides unbiased details and additional keywords to associate with the business in question, which contributes to relevance. The reviews provide Google with context as to which businesses merit the greatest visibility and which deserve to be buried. Google also looks to reviewers for confirmation that the local business location and details are accurate, thus improving proximity.

While there’s no denying the power of links in SEO, strong reputations with reviewers can also convey authority to search engines. It’s easy for businesses to focus on the negatives with online reviews. But they’re also helping businesses rank well on Google and pursue their target audience.

Controlling business information

Knowledge is power for Google. It aggregates information from review sites in order to improve search results. But how does Google make sure that this information is accurate?

Business information is used for two types of search: organic and local. On the organic side, Google pulls information from your website and reviews from sites like Facebook, Yelp and industry-specific sites. Google also pulls from Wikipedia and Wikidata. You should monitor all of these sources to make sure they are accurate and up to date.

The local side works differently. Google uses Google My Business to verify things like business category, hours, photos and general information. Business owners can control this flow of information by logging into the portal and making updates.

However, Google takes it a step further. It uses primary data suppliers (Acxiom, Localeze, Infogroup and Factual) to verify business information. Google also has a second tier of data providers for local search, which includes directories and review sites like Facebook and Yelp.

Reviews have the power to override business information on Google. For instance, let’s say you don’t list your business hours anywhere online. If a highly trusted Yelp reviewer reports that your business is closed on Sundays, then it’s possible that Google will trust their data and update your hours.

Google may give review sites a lot of power, but business owners can still control their own narrative. Tools like Google My Business and Yelp’s free Business Owner account allow you to make changes to your listings. Moz Local is also a powerful tool — you can create a business listing to be distributed across the local search ecosystem, and Moz will submit your data across all the sources and allow you to make changes to any incorrect data that pops up on the web.

Google’s Knowledge Panel

It’s easier than ever to access online reviews. You don’t even have to go to review sites to see your business’s reputation. They’re now being integrated into local search.

Google’s Knowledge Panel showcases reviews from Google, Facebook and other industry-related sites. The Knowledge Panel is an instrumental tool for users, especially on mobile. It helps them research a business, get key information and take action without ever clicking through to the business’s website.

Let’s say someone is doing a search for “Infusion Coffee” in Tempe, Arizona. This is what will be displayed:

Potential customers can call, get directions, order and look at Facebook and Google reviews without ever visiting Infusion’s website. The Google Knowledge Panel can also drive local search. Let’s say a separate user is looking for coffee in Tempe, but they don’t know about Infusion. This is what they will see instead:

The results are once again determined by relevance, authority and proximity. But it’s also clear that reviews are a factor in search rank. The top hit, Cartel Coffee Lab, has a 4.5-star reputation and over 400 reviews. Users can even refine their search by star rating.

Yes, you can control basic information on a Google My Business Page. But that only takes you so far. User-generated content and reviews are featured prominently in the Knowledge Panel. Translation: There’s no substitute for a stellar online reputation.

Promoting reviews

You might notice that a quick Google search of your industry yields mixed results. Some businesses have star ratings attached to their organic results, while others lack reviews altogether.

Do you want to add reviews to your organic results? The secret lies in structured data markup from Schema.org: HTML markup that gives search engines more information about websites. Rich snippets that appear in the Google results can be composed of text, images and/or review stars, and give searchers more details to help them choose the best, most relevant listing.

Google can display review stars as a rich snippet if it discovers valid reviews or ratings markup on your page. However, businesses have to include reviews on their website in order for them to be displayed, and there are some strict guidelines to follow in this regard:

• Ratings must be sourced directly from users.

• Don’t rely on human editors to create, curate or compile ratings information for local businesses. These types of reviews are critic reviews.

• Sites must collect ratings information directly from users and not from other sites.

So, copying and pasting review text from other review sites (like Yelp) is prohibited under Google policy. The reviews must be unique to your site and not duplicated on other platforms.
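To make those rules concrete, here is a sketch of qualifying markup: a hypothetical LocalBusiness with first-party ratings expressed as Schema.org JSON-LD. The business name, rating figures and review text are placeholders, not real data.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "review": [{
    "@type": "Review",
    "author": { "@type": "Person", "name": "Jane D." },
    "reviewRating": { "@type": "Rating", "ratingValue": "5" },
    "reviewBody": "Great espresso and friendly staff."
  }]
}
</script>
```

Per the guidelines above, the `ratingValue` and `reviewCount` must reflect reviews actually collected from users on your own site, not figures copied from another platform.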

Controlling bad reviews

Good reviews can be a valuable tool for driving conversions. But what about bad reviews? It’s true that bad reviews can damage businesses — many potential customers will not purchase from a business with negative reviews.

Some business owners have tried to control bad reviews, but there’s no simple solution. Businesses looking for a quick fix might seek out a pay-for-review site. Not only are these sites illegal, but using them might result in a penalty from Google and other sites.

Incentivizing reviews can be just as damaging. It’s a violation of Yelp and Google’s terms of service. “Astroturfing,” or creating fake positive reviews for reputation management, is a big taboo in the digital space, and it can lead to penalties from Google.

So, what’s a business to do if they’re stuck with bad reviews? The simple answer is to build a solid reputation. Focus on enhanced customer service, set the right expectations and listen to customer feedback.

You can also score your customer service internally. Net Promoter Score provides a structured way to solicit and analyze customer feedback, giving you a single customer satisfaction metric. Continue to keep an eye on this metric. You can see what is helping (or hurting) your online reviews and adjust your strategies from there.
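The arithmetic behind NPS is simple: the percentage of promoters (scores of 9 or 10 on the standard 0-10 survey) minus the percentage of detractors (scores of 0 through 6). A minimal sketch, with an illustrative function name and sample scores:

```javascript
// Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
// computed from an array of 0-10 survey responses.
function netPromoterScore(scores) {
  const promoters = scores.filter((s) => s >= 9).length;
  const detractors = scores.filter((s) => s <= 6).length;
  return Math.round(((promoters - detractors) / scores.length) * 100);
}

// Example: 5 promoters, 3 passives (7-8), 2 detractors out of 10 responses
// => (5 - 2) / 10 * 100 = 30
console.log(netPromoterScore([10, 9, 9, 10, 9, 8, 7, 8, 4, 6])); // → 30
```

Tracking this single number over time gives you an early signal of whether your customer service changes are likely to show up in your public reviews.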

Sometimes you get lucky and can make a negative review go away simply by offering to make things right for the customer. Fellow Search Engine Land columnist and online reputation strategist Chris Silver Smith describes how to turn things around when responding to a bad review.

Yelp and some of the other prominent review sites allow owners to post responses to customer reviews. On Yelp, if you respond to the reviewer and offer to address their issues, they will hopefully post a follow-up describing how you addressed their complaint and exceeded their expectations, making the combined review storyline even more beneficial to your business than an unbroken line of positive reviews.

GetFiveStars is an invaluable tool for improving your reputation on certain sites. It uses your Net Promoter Score in its decision tree, interacting with customers and clients differently based on their satisfaction level. The service also lets you automate outreach for feedback, so you can focus on getting legitimate, quality reviews and better understand what’s contributing to your online reputation.

Online reviews are embedded into modern search engines. Reputation building doesn’t happen overnight. You must take a proactive approach and use reviews to your advantage. You can’t hide bad reviews, but good reviews can be an integral part of your local search strategy. Online reviews are part of your business, for better or worse. It’s up to you to make the most of them.


Anatomy of a Google search listing /anatomy-google-search-listing-282725 Wed, 20 Sep 2017 16:24:08 +0000 http:/?p=282725 There’s no perfect method to snagging the top overall search result for every relevant query, but columnist Stephan Spencer believes that understanding each element of Google's search listings can give you the best chance for success.

The post Anatomy of a Google search listing appeared first on Search Engine Land.


Are you looking to dominate in Google search results?

Your strategy needs to involve more than keyword research and a savvy AdWords campaign. In order to make the most of your Google presence, you need to craft a search result that entices users to click through to your web page. This is a crucial yet often-ignored aspect of SEO.

Believe it or not, small changes to your Google listing can make a big difference when it comes to click-through rate. Here is a detailed guide to better understanding a basic Google search listing.


Title

It’s no secret that page titles can heavily influence user behavior. But did you know that Google doesn’t always show a web page’s title tag? The title that appears in search results can be influenced by several factors. Google looks for titles that are short, descriptive and relevant to the search query. Though it most commonly uses a page’s title tag, it can also pull from page content or from links pointing to the page. Keep your title tag short and descriptive to increase the odds that Google displays it as-is.

Is your title being cut off in the Google search results? You might need to shorten it. The maximum length for a title tag is 600 pixels, which is about 70 characters (78 on mobile); beyond that, Google will truncate it. Truncated titles are indicated by an ellipsis.


URL

You may have noticed that Google often omits parts of a URL. Google truncates URLs by removing their middle sections, even when the URL fits on a single line. Use short but meaningful URLs whenever possible to maximize their impact in the Google SERPs (search engine results pages).

The URL is often displayed as clickable breadcrumb links. In these instances, Google displays the site’s internal hierarchical linking structure from the on-page breadcrumb navigation when those breadcrumbs are marked up using breadcrumb semantic markup.
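As a sketch, that breadcrumb markup might look like the following Schema.org BreadcrumbList in JSON-LD (the example.com URLs and page names are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Coffee",
      "item": "https://www.example.com/coffee/" },
    { "@type": "ListItem", "position": 3, "name": "Espresso Beans" }
  ]
}
</script>
```

The markup should mirror the breadcrumb trail your visitors actually see in the on-page navigation.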

Google search listings may also include time stamps under their URL. This is a common practice for news publishers, blogs and other sites that wish to bring attention to the freshness of their content and provide the date of publication or date of last update.

To integrate this, you need to add a time stamp into your page copy. You can provide Google with specific times by adding comment tags through the W3 Total Cache plugin for WordPress, which will appear something like this: Served from user @ 2017-03-03 17:15:25.

You can also manually add a time tag to a page or blog post using structured data markup. Otherwise, Google will use the publication date, which is easy for Google to determine with WordPress blogs.

Here is an example of the structured data HTML markup:
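A representative sketch, assuming Schema.org’s Article type with `datePublished` and `dateModified` expressed as JSON-LD (the headline and dates are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "datePublished": "2017-03-03T17:15:25+00:00",
  "dateModified": "2017-03-10T09:00:00+00:00"
}
</script>
```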

Cached link

The cached link is a fail-safe in case your website is unavailable; it is a snapshot that Google takes of each page and adds to its cache. It also serves as a backup in case a page is deleted, temporarily down or failing to load.

Google has changed the location of the cached link in recent years. Cached links are now found behind a green down arrow next to the URL.

The cached link will be missing for sites that have not been indexed, as well as for sites whose owners have requested that Google not cache their content. Owners can block their page from being cached by using a meta-robots “noarchive” tag.
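The tag itself is a one-liner in the page’s &lt;head&gt;:

```html
<!-- Ask search engines not to keep or display a cached copy of this page -->
<meta name="robots" content="noarchive">
```

To apply it to Google alone rather than all crawlers, use `name="googlebot"` instead of `name="robots"`.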

What’s the benefit of doing this? For one thing, it can prevent users from copying your content for redistribution: people can still copy and paste content from a cached page even if you’ve blocked those functions on your site. Sites with paid content often block cached pages to prevent their content from being seen for free. Fortunately for them, whether or not Google caches a page has no bearing on its ranking.


Snippet

A snippet is the description of the page that appears underneath the title. Google can obtain the snippet either from the page’s meta description tag or from contextual information on the page. Like titles, snippets are query-based and can change for different keyword searches.

For example, in a search for “meta description,” the snippet below is returned for the search result.

Searching for “160 character snippet” in Google returns a very different snippet for a search result for the same page as above.

Keyword bolding (known by us information retrieval geeks as “Keywords in Context” or KWIC) is also query-based and will often appear in the snippet, depending on the search term.

Google currently limits a snippet to around 156 characters per search result (or 141 with a date stamp). The actual limit, in terms of total pixel width, is 928 pixels (based on 13px Arial). Snippets will be truncated and end with ellipses when they run over this limit.

Often, Google will choose not to use a meta description in favor of a more relevant snippet. The snippet can come from anywhere on your page (including disparate parts of the page), so it’s important to pay close attention to your content — especially around common keywords.

It’s still worth it to carefully craft a meta description. In many cases, Google will still show a quality meta description for popular searches. What makes it a quality meta description? It’s well-written, includes popular search terms and avoids redundant information, such as repetition of the title tag. Since the snippet is query-based, you need to incorporate popular, relevant search terms into both your meta description and your on-page content.

There are also times when a snippet does not appear. Why does this happen? It’s because the URL is blocked by a Disallow rule in the site’s robots.txt file. In such cases, Google will display a message in the snippet’s place stating, “A description for this result is not available because of this site’s robots.txt.”

You can prevent this with noindex instead of disallow. That way, Google can still crawl the page, but it will not add it to its search engine index or display it in the SERPs.

Conversely, you can opt out of snippets by using the <meta name=”googlebot” content=”nosnippet”> tag on your page.
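A sketch contrasting the three mechanisms discussed above (the /private-page/ path is a placeholder):

```html
<!-- In robots.txt: blocks crawling, which can trigger the
     "no description available" message in the SERPs:
       User-agent: *
       Disallow: /private-page/
-->

<!-- In the page's <head>: crawlable, but kept out of the index entirely -->
<meta name="robots" content="noindex">

<!-- In the page's <head>: indexed and listed, but shown without a snippet -->
<meta name="googlebot" content="nosnippet">
```

Note that `noindex` only works if Google is allowed to crawl the page; a robots.txt Disallow prevents the crawler from ever seeing the tag.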


Sitelinks

Google sitelinks are additional sub-listings that appear underneath the first search result. For instance, if a user were to search for “Search engine land,” this is what they would see:

Sitelinks are intended to help users navigate around websites. In this instance, the user might want to jump to the latest industry news rather than navigating through Search Engine Land’s home page.

You might have noticed a “More results” feature in the above screenshot. It restricts results to indexed pages from that specific site. In this example, the More results from >> link leads to a refined search of just that site’s pages for the query “Search engine land.” This is accomplished using Google’s site: search operator.

Google allows up to six automated sitelinks, but they are far from guaranteed for poorly optimized sites. Websites with a clear hierarchy and structure and a unique brand name are more likely to have sitelinks. As a result, you’re more likely to see sitelinks appear in search results after typing in a specific brand.

You’ll notice that in this instance, a search for The New York Times renders both sitelinks and a search box. If you wish to include a sitelinks search box, you can do so by embedding structured data on your website.
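That structured data is Schema.org’s WebSite type with a nested SearchAction; a sketch, where example.com and the search URL pattern are placeholders you would swap for your own site’s internal search endpoint:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```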

Though the system is automated, the best way to get sitelinks is to reach the top overall position for your website name. A downside to using different domains (or subdomains) in your web strategy is that they won’t be included in the sitelinks. Still, the impact of sitelinks is undeniable. AdWords advertisers with sitelinks see a 20-50 percent boost in click-through rate when the search is a branded term.

Final thoughts

Small changes to a search result can have a big impact on a site’s traffic. Google search is an ever-evolving science, so rules that exist today might not exist tomorrow. For the time being, you can follow this guide to help improve your presence in the Google SERPs.


Featured snippets: How much do you really know about them? [QUIZ] /featured-snippets-much-really-know-quiz-279806 Fri, 28 Jul 2017 15:39:18 +0000 http:/?p=279806 Think you're an expert on featured snippets? Then put your money where your mouth is and take this quiz, created by columnist Stephan Spencer!

The post Featured snippets: How much do you really know about them? [QUIZ] appeared first on Search Engine Land.

