Aleh Barysevich – Search Engine Land

How crawl budget has changed in the last 2 years (May 23, 2019)

Here’s what you need to know about crawl budget and what it all means for your optimization efforts now.

Understanding crawl budget is an often overlooked part of SEO. But a two-year-old post my team wrote about the topic is practically ancient history in the SEO industry. So, in this article, I’ll be explaining how our understanding of crawl budget has changed in the past couple years, what’s stayed the same, and what it all means for your crawl budget optimization efforts.

What is crawl budget and why does it matter?

Computer programs designed to collect information from web pages are called web spiders, crawlers or bots. These can be malicious (e.g., hacker spiders) or beneficial (e.g., search engine and web service spiders). For example, my company’s backlink index is built using a spider called BLEXBot, which crawls up to 7.5 billion pages daily gathering backlink data.

When we talk about crawl budget, we’re actually talking about the frequency with which search engine spiders crawl your web pages. According to Google, crawl budget is a combination of your crawl rate limit (i.e., limits that ensure bots like Googlebot don’t crawl your pages so often that it hurts your server) and your crawl demand (i.e., how much Google wants to crawl your pages).

Optimizing your crawl budget means increasing how often spiders can “visit” each page, collect information and send that data to other algorithms in charge of indexing and evaluating content quality. Simply put, the better your crawl budget, the faster your information will be updated in search engine indexes when you make changes to your site.

But don’t worry: unless you’re running a large-scale website (millions or billions of URLs), you will likely never need to worry about crawl budget.

So why bother with crawl budget optimization? Because even if you don’t need to improve your crawl budget, these tips include a lot of good practices that improve the overall health of your site.

And, as John Mueller has explained, the potential benefits of having a leaner site include higher conversions, even if they’re not guaranteed to impact a page’s rank in SERPs.


What’s stayed the same?

In a Google Webmaster Hangout on Dec. 14, 2018, John was asked how one could determine their crawl budget. He explained that it’s tough to pin down because crawl budget is not an external-facing metric.

He also says:

“[Crawl budget] is something that changes quite a bit over time. Our algorithms are very dynamic and they try to react fairly quickly to changes that you make on your website … it’s not something that’s assigned one time to a website.”

He illustrates this with a few examples:

  • You could reduce your crawl budget by doing something like improperly setting up a CMS: Googlebot might notice how slow your pages have become and slow down crawling within a day or two.
  • You could increase your crawl budget if you improved your website (by moving to a CDN or serving content more quickly). Googlebot would notice and your crawl demand would go up.

This is consistent with what we knew about crawl budget a couple of years ago. Many best practices for optimizing crawl budget are also equally applicable today:

1. Don’t block important pages

You need to make sure that all of your important pages are crawlable. Content won’t provide you with any value if your .htaccess and robots.txt are inhibiting search bots’ ability to crawl essential pages.

Conversely, you can direct search bots away from unimportant pages (for example, with disallow rules in robots.txt). Just note that if you disallow a lot of content, or if a restricted page receives a lot of incoming links, Googlebot may assume you’ve made a mistake and crawl those pages anyway.
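To make that concrete, here is a minimal robots.txt sketch; the paths are hypothetical placeholders you would swap for your own low-value sections:

User-agent: *
Disallow: /internal-search/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml

Disallow rules like these keep bots focused on the pages that matter, while the noindex options below control what actually stays out of the index.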

The following meta tag in the <head> section of your page will prevent most search engine bots from indexing it: <meta name="robots" content="noindex">

You can also prevent Googlebot specifically from indexing a page with the following meta tag: <meta name="googlebot" content="noindex">

Alternatively, you can return a “noindex” X-Robots-Tag HTTP header, which instructs spiders not to index your page: X-Robots-Tag: noindex

2. Stick to HTML whenever possible

Googlebot has gotten a lot better at processing JavaScript and crawling rich media files like Flash and XML, but other search engine bots still struggle with many of these formats. I recommend avoiding these files in favor of plain HTML whenever possible. You may also want to provide search engine bots with text versions of pages that rely heavily on rich media.

3. Fix long redirect chains

Each redirected URL squanders a little bit of your crawl budget. Worse, search bots may stop following redirects if they encounter an unreasonable number of 301 and 302 redirects in a row. Try to limit the number of redirects you have on your website and use them no more than twice in a row.
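For example, if /old-page currently redirects to /newer-page, which in turn redirects to /final-page, point both legacy URLs straight at the destination. A minimal sketch in Apache .htaccess syntax (the paths are placeholders):

Redirect 301 /old-page /final-page
Redirect 301 /newer-page /final-page

This collapses the chain to a single hop for both users and crawlers.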

4. Tell Googlebot about URL parameters

If your CMS generates lots of dynamic URLs (as many of the popular ones do), then you may be wasting your crawl budget – and maybe even raising red flags about duplicate content. To inform Googlebot about URL parameters that your website engine or CMS has added that don’t impact page content, all you have to do is add the parameters in Google Search Console (go to Crawl > URL Parameters).

5. Correct HTTP errors

John corrected a common misconception in late 2017, clarifying that 404 and 410 pages do in fact use your crawl budget. Since you don’t want to waste your crawl budget on error pages — or confuse users who try to reach those pages — it’s in your best interest to search for HTTP errors and fix them ASAP.

6. Keep your sitemap up to date

A clean XML sitemap will help users and bots alike understand where internal links lead and how your site is structured. Your sitemap should only include canonical URLs (a sitemap is a canonicalization signal where Google is concerned) and it should be consistent with your robots.txt file (don’t tell spiders to crawl a page you’ve blocked them from).
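For reference, a bare-bones sitemap entry looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/important-page/</loc>
    <lastmod>2019-05-01</lastmod>
  </url>
</urlset>

Every <loc> should be the canonical, indexable version of the URL.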

7. Use rel=”canonical” to avoid duplicate content

Speaking of canonicalization, you can use rel=”canonical” to tell bots which URL is the main version of a page. However, it’s in your best interest to ensure that all of the content across the various versions of your page lines up – just in case. Since Google introduced mobile-first indexing back in 2016, they often default to treating the mobile version of a page as the canonical version.
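The tag itself goes in the <head> of each duplicate or variant page and points at the preferred URL (example.com is a placeholder): <link rel="canonical" href="https://www.example.com/preferred-page/" />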

8. Use hreflang tags to indicate country/language

Bots use hreflang tags to understand localized versions of your pages, including language- and region-specific content. You can use either HTML tags, HTTP headers, or your sitemap to indicate localized pages to Google. To do this:

You can add the following link element to your page’s header: <link rel="alternate" hreflang="lang_code" href="url_of_page" />

You can return an HTTP header that tells Google about the language variants on the page (you can also use this for non-HTML files such as PDFs) by specifying a supported language/region code. Your header format should look something like this: Link: <url1>; rel="alternate"; hreflang="lang_code_1"

You can also list localized versions in your sitemap: each <url> entry’s <loc> element is accompanied by child entries pointing to every localized version of the page. Google’s hreflang documentation covers how to set up language- and region-specific pages in more detail so that search engine bots crawl the right version for each audience.
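As a sketch of the sitemap method, each <url> entry lists every localized variant (including itself) as an xhtml:link child; the URLs and language codes below are placeholders:

<url>
  <loc>https://www.example.com/en/page/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
  <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page/" />
</url>

For this to validate, the surrounding urlset element also needs the xmlns:xhtml="http://www.w3.org/1999/xhtml" namespace declaration.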

What’s changed?

Two main things have changed since we wrote that original article in 2017.

First, I no longer recommend RSS feeds. RSS had a small resurgence in the wake of the Cambridge Analytica scandal as many users shied away from social media algorithms – but it’s not widely used (except maybe by news reporters) and it’s not making a significant comeback.

Second, as part of the original article, we conducted an experiment that suggested a strong correlation between external links and crawl budget. It seemed to suggest that growing your link profile would help your site’s crawl budget grow proportionally.

The aforementioned Google Webmaster Hangout seemed to corroborate this finding; John mentions that a site’s crawl budget is “based a lot on demand from our side.”

But when we tried to update the study on our end, we couldn’t recreate those original findings. The correlation was very loose, suggesting that Google’s algorithm has grown quite a bit more sophisticated since 2017.

That said, please don’t read this and think, “Great, I can stop link building!”

Links remain one of the most important signals that Google and other search engines use to judge relevancy and quality. So, while link building may not be essential for improving your crawl budget, it should be a priority when you want to improve your SEO.

And that’s it! If you want to learn more about crawl budget, I recommend checking out Stephan Spencer’s three-part guide to bot herding and spider wrangling.

Why we shouldn’t forget about PageRank in 2019 (April 15, 2019)

Before we talk about anything else, let’s address the elephant in the room.

The last official public PageRank update happened in December 2013. In October 2014, Google’s John Mueller confirmed what we’d long suspected – that Google Toolbar PageRank was officially dead. The final nail in the coffin came two years later when Google removed Toolbar PageRank from its browser.

So, understandably, a lot of people roll their eyes when they see “news” about PageRank in 2019. And a lot of people are content to let PageRank remain a relic of the past.

But even though Toolbar PageRank is gone, Google’s Gary Illyes confirmed in 2017 that PageRank* is still a ranking signal (albeit one of hundreds that Google uses).

*(To avoid confusion, I’ll be using the term “Toolbar PR” to refer to the now non-existent public version and “PageRank” to refer to Google’s behind-the-scenes metric.)

We also have other evidence that Google is still using and updating PageRank behind the scenes: In April 2018, they updated their PageRank patent and filed for a continuation.

Of course, just because Google has a patent, it doesn’t mean they use it. But there’s enough evidence here to suggest PageRank still exists and it’s still a metric that matters, even if the general public can no longer view their 0 to 10 Toolbar PR score. PageRank is far from the only metric that matters, but that doesn’t mean we should ignore it entirely or pretend it doesn’t impact rankings.

Which brings us to the topic at hand: If we know PageRank is still a ranking signal but we don’t know what changes Google is making to their algorithm or how impactful PageRank is in 2019, what can we do about it?

This is where SEOs generally fall into one of two camps.

  1. You ignore it and focus exclusively on the metrics you can accurately measure that you know will improve your website (in which case, this article won’t be of much use to you).
  2. You use a new metric that strongly correlates with PageRank that will help you make educated guesses and optimize for the ranking signal we know still exists (in which case, read on!).

There’s no one-to-one substitution for Toolbar PR available and no foolproof way to calculate PageRank. I’m not going to peddle snake oil and tell you such a thing exists.

Instead, this article will give you a (very brief) rundown of PageRank’s history, explain how it is (or was) calculated, and then talk about a recent study that shows the correlation between these alternate solutions and rankings. We’ll end with some basic tips that should help you improve your PageRank (in theory, at least).

Let’s jump into a (short and sweet) history of PageRank.

What is PageRank?

If you had even a passing familiarity with SEO over the past few years, you’ll probably recognize Toolbar PR. Toolbar PR took the more complicated behind-the-scenes PageRank metric and condensed it down into a zero to 10 score that was easy to understand: the higher your number, the better your page.

What is PageRank? The TL;DR version is that it’s a way for Google to rank web pages according to importance as determined by the number and quality of a page’s inlinks.

So, let’s pretend your page has a PageRank of 10/10 and includes links to five other pages. According to Google’s original formula, 85% of your page’s PageRank would be divvied up among the pages you linked out to: 8.5/5 = 1.7. So each page would receive a PageRank of 1.7/10 from your page.

(This is a very simplified look at how PageRank was calculated. If you’re interested in the math, I recommend brushing up on How PageRank Really Works by Dixon Jones. It’s a fascinating read.)
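For reference, here is the formula as it appeared in Brin and Page’s original paper, which is where the 85% in the example above comes from (d is the damping factor, historically set to 0.85):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A and C(T) is the number of outbound links on page T. In practice the calculation is iterated over the whole link graph until the values converge, but the intuition holds: each link passes along a share of the linking page’s own importance.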

Suffice to say that when SEOs talk about the “quality of a link” or “link authority” or “link juice” or use phrases like “not all links are created equal,” this is part of what we’re talking about.

Of course, Google’s changed a lot about how they rank pages over the years. It would be foolish to assume that PageRank hasn’t undergone a similar evolution. Nevertheless, it’s a safe bet that the fundamental concept and underlying goals of PageRank have remained fairly constant over the years.

Why Toolbar PR died

In theory, PageRank sounds like a good way to find out which pages users are looking for and ensure that users are shown the best and safest pages that fit their search criteria. And evidently, Google is still using PageRank in some capacity in their ranking algorithm.

But, as I’ve already mentioned, by late 2013 the public-facing tool was practically pushing up daisies. So what went wrong? The short answer is: SEO went too far.

With the advent of Toolbar PR, PageRank was adopted as the ranking signal to optimize for. What came next was a wave of link farms and link spam. Unscrupulous SEOs tried to game the system and artificially bolster their PageRank score. Buying “high PR” links became its own industry and agencies began selling the service to companies around the world.

In response, Google cracked down on black-hat SEO practices and began deprecating PageRank. They also became tighter lipped about their ranking process. It wasn’t long before Google removed PageRank data from Webmaster Tools (now Google Search Console).

Google explained why many times:

“We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it’s the most important metric for them to track, which is simply not true. We removed it because we felt it was silly to tell people not to think about it, but then to show them the data, implying that they should look at it. :-),” said Google’s Susan Moskwa.

In 2016, Toolbar PR was officially gone.

So, why are we talking about PageRank today if it’s been out of sight and mind for the past three years? Because, in the absence of Toolbar PR, SEOs have devised many strength-of-domain link-based metrics we can use to approximate PageRank.*

*(Again, to be clear, none of these are exact replicas of PageRank, nor do they tap into what Google is doing behind the scenes. But, as thousands of people have discovered, they’re still useful metrics when you’re performing SEO and looking for areas to improve.)

Today, we have access to more third-party replacement metrics than ever before, and more are appearing all the time. This raises the question: which metric is the best to use in the majority of SEO cases (i.e., what’s the new “gold standard”)?

To find out, it’s become standard practice to measure these metrics against tangible SEO results – SERP positions in particular. This way, we can see which metric more closely corresponds to real-world results.

Domain InLink Rank Correlation results

Domain InLink Rank is our own link-based, strength-of-domain metric (one of the PageRank alternatives described above). We want it to reflect the ranking power of pages as accurately as possible so SEOs gain a better understanding of the quality of their domains and pages. To test the validity of the metric, we regularly conduct Domain InLink Rank correlation studies.

Our most recent study took place between March 4-6, 2019. During that time, we compiled a list of 1,000,000 URLs taken from the top 30 positions in Google SERPs for 33,500 queries. We then calculated Domain InLink Rank for each SERP.

We found a positive correlation between InLink Rank and SERP ranking of 0.128482487. I won’t go into the full details here, but you can check out the study if you’re interested in our specific methodology.

Suffice to say that the metric correlates strongly enough with actual Google rankings.

How to improve PageRank

Even though we can’t see our PageRank score or measure it directly, there are a few best-practice steps you can take to preserve and increase your PageRank:

  1. Build quality backlinks. Regularly run backlink analysis (probably using one of the tools mentioned above) and make sure the link juice flowing into your site is from high-PageRank pages. As a side note, there’s a common belief among some SEO professionals that some links may not pass PageRank at all – and some may even pass negative PageRank (something close to Spam Rank). The best thing you can do to avoid these problems is just to keep your link profile clean.
  2. Maintain a shallow site structure. You should keep your most important pages as close to your high-PageRank pages as possible. On most websites, this will be your homepage but that may not be the case for sites with internal landing pages that are higher quality and have more external links compared to homepages.
  3. Follow best-practices for links coming out of a page. Carefully consider where links live on a page; links appearing in your content are more valuable than navigational links. And keep the number of links pointing out of each of your pages to other internal and external pages down to a reasonable number.
  4. Tell users where links lead. Using proper anchor text will both help users navigate your content and help your SEO. Just keep it contextual and avoid keyword stuffing. When done organically, placement of relevant keywords in the anchor text tells search engines about your page’s content.
  5. Leverage linkless mentions. As semantic search gets smarter, linkless mentions will probably grow in importance. I’m not saying they’re a ranking signal yet but we have received hints that Google has started considering online brand mentions in its search algorithm.

Conclusion

When you’re seeking new ways to improve your rankings, it helps to consider all the signals we know about and factor those into the decisions you make that improve your website experience – both for users and search engines.

InLink Rank and other similar metrics are a good starting point for evaluating the ranking potential of particular domains: industry studies have shown that these metrics correlate strongly enough with SERP rankings to be useful.

Today, links are still a vitally important part of SEO. If you understand this, you know that many of our industry’s best-practice optimizations have an impact on PageRank. We may not be able to measure it directly but it doesn’t hurt to remember that the PageRank formula is still important in 2019.

3 easy internal linking strategies for keywords with different search volumes (February 11, 2019)

Contributor Aleh Barysevich breaks down three strategies you can use to boost the effectiveness of your internal link building campaigns to fulfill specific SEO-related goals.

Most savvy business owners and content marketers understand the importance of external links. They’re a crucial ranking factor (as evidenced by new studies year after year, such as this brand new Stone Temple Consulting Study) and they’re a strong trust signal from other high-quality websites.

Where a lot of businesses stumble, however, is using internal links to direct link equity to where it will have the biggest impact. Internal links don’t earn you link equity like external links do, but they’re essential for directing traffic to pages that traditionally attract fewer links or need a boost in SERPs.

In this article, we’ll break down the dos and don’ts of a good link structure, why you need an internal link strategy, and three different strategies you can use to target keywords with varying levels of competition and search volumes.

How do you create a good internal link structure?

What your internal link structure looks like will vary depending on your underlying goals (as you’ll see later when we dive into the specific linking strategies), but a few elements should always be the same:

  • Maintain a shallow click-depth. During a Google Webmaster Central hangout in mid-2018, John Mueller confirmed that the fewer clicks it takes to get to a page from your home page, the better. I recommend keeping your site structure as shallow as possible: ideally, every page should be accessible within two to three clicks of the home page, or use breadcrumbs, tag clouds and internal search to facilitate ease of use on more complicated websites.
  • Include links in your pages’ main content. There are two types of internal links: navigational and contextual. Navigational links live in your header, footer and navigation bars; they help users find other pages within the same domain and help search engines crawl your website. Contextual links—which are what we’re talking about in this article—appear in your pages’ content and carry higher SEO value.
  • Include keywords in your anchor text. Most SEOs would advise against using exact-match keywords in internal link anchor text, but the better advice is to ensure that all anchor text tells readers what to expect from the linked content. Including keywords in your anchor text shouldn’t be a problem if you’re already creating highly optimized content. Also, remember to give image links alt attributes that include keywords (alt text acts like anchor text does for text links; a markup example follows this list).
  • Maintain a reasonable number of links on each page. Google Webmaster Guidelines recommend limiting the number of links to a reasonable number. This both aids user readability and helps you avoid getting flagged as spam. Also, remember that if you point to the same URL multiple times on the same page, priority is given to the first anchor text and the subsequent anchors are relatively inconsequential.
  • Make sure every important page is linked. Search engines can often find orphan pages—pages that aren’t linked to by any other page—but users can’t. Depending on the nature of these pages, you may choose to delete them, link out to them or block them from indexation.
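As a quick illustration of the contextual-link and anchor-text points above, here is a hypothetical snippet (the URLs, copy and file names are placeholders):

<p>For a deeper dive, read our <a href="/guides/internal-linking/">internal linking guide for e-commerce sites</a>.</p>
<a href="/guides/internal-linking/">
  <img src="/images/internal-linking-diagram.png" alt="internal linking structure diagram">
</a>

The descriptive anchor text and the image’s alt attribute both tell users and search engines what to expect from the linked page.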

Why you need an internal link strategy

According to CMI’s 2019 B2B Benchmarks, Budgets, and Trends report, 81 percent of B2B businesses believe that having a content strategy aligns their team “around common mission/goals and makes it easier to determine which types of content to develop.”

The same thing applies to internal linking strategies. The better you understand what you want your link equity to do for your business, the better you’ll be able to use an internal linking structure to achieve your goals.

Appropriately used, internal links can be a powerful tool. Creating a clean, consistent internal link structure is an amazing way to:

  • Provide additional, helpful information to your visitors.
  • Help Google and other search engines crawl your website faster.
  • Increase traffic to high-converting but low-traffic pages, such as product pages (Andrew Dennis’s 2018 article about “link building’s secret sauce” includes examples of how to do this).
  • Promote pages that are stuck on page 2 of SERPs (we call these “low-hanging fruit”).
  • Improve rankings for high, mid, or low search-volume keywords.

The very best internal link strategies pull double duty, by influencing user engagement metrics (e.g., page views per session, time spent on site, conversion rate, etc.) and impacting your ranking in SERPs for high-priority keywords. You can facilitate this by considering the customer journey on your site as you plan out which internal link strategy is right for you.

Now, let’s dive into the three strategies you can use specifically to target keywords based on search volume and competition level.

Internal Link Strategies Based on Search Volume

1. Use internal links to boost main page relevance for keywords with high search volumes.

When to use this strategy:

When your goal is to rank for a few specific high-volume, high-competition keywords. You’ll need a detail-rich home page to use this strategy, such as a landing-page-style home page designed to attract, persuade and convert new leads.

How to structure your internal links:

While your navigational links will still help users find your content and discover new pages on your website, most of your contextual links should link back to your home page through relevant anchor text (e.g., target keywords plus close synonyms).

Structurally, this means you’ll have more links pointing to your home page than to any other page, so visitors to other high-quality auxiliary pages on your site should quickly find themselves back on your information-rich home page.

As mentioned above, however, if you point to the same URL multiple times on the same page, priority is given to the first anchor text. With that in mind, what some webmasters resort to is restricting access to navigational links for search engine bots at the top of the page to give more prominence to contextual links.

What this means:

The only goal of this strategy is to help your home page’s rank improve. You’ll be using every opportunity and leveraging every new piece of content to send more organic visitors to your home page.

Just keep in mind that all secondary pages and content assets (though they still need to be useful and relevant to attract external links) will not be designed to rank high for keywords—all of that link juice is destined for your home page.

2. Use internal links to target mid-search-volume keywords and drive traffic to key landing pages.

When to use this strategy:

When you want to target mid-search-volume keywords and drive that traffic to key pages, such as product category pages within an e-commerce website or blog categories within a news-style website. This works best with robust category pages that include plenty of detail and comparisons of the products, blog posts and so on.

How to structure your internal links:

With this approach, you’ll be using anchor text keywords to lead people to key category pages. In this strategy, your home page’s job is to direct people to the most relevant category page. Auxiliary articles and product pages should also all point back to these pages using medium-tail anchor text, to lead as much traffic as possible back to your category pages.

What this means:

This strategy turns each category page into an informational hub that users can revisit as they learn new information.

For example, a website selling second-hand cars might have a category page for Ford trucks. Whenever they publish new articles reviewing a new model or comparing Ford against other brands, they can link back to their category page using target keywords (e.g., “buy Ford trucks,” “used Ford trucks,” “best deals on Ford trucks,” etc.).

3. Use internal links to target low-search-volume keywords for your bottom-level pages.

When to use this strategy:

When you operate within a narrow niche and want to drive highly qualified leads to specific bottom-level pages, such as specific blog posts or product listings.

How to structure your internal links:

Bottom-level pages in this strategy should be quite detailed so that you can include copy and images that can be organically linked to other bottom-level pages.

What this means:

The goal of this strategy is to get users to see the “big picture” that unfolds as they purchase multiple products or consume numerous pieces of content. For example, you might have a multi-part blog series that naturally lends itself to internal links. Or you might have product pages for power tools that link to product comparisons and DIY home projects that you can build with those tools.

The less competition you have for your keywords, the more likely it is that your pages will rank and convert. Just make sure that the keywords you’re targeting are actually being searched for.

How to implement an internal link strategy

Once you’ve settled on a link strategy that will help you accomplish your goals, it’s time to assess your site’s internal links and anchor text. For this step, I highly recommend using a tool capable of measuring click depth, links to a page, links from a page, and metrics that estimate the importance of web pages (i.e., alternatives to PageRank).

Luckily, plenty of tools like WebSite Auditor (full disclosure: I work for the company), DeepCrawl, or Sitebulb help webmasters understand, at a glance, which pages have the most link equity to share, what your current internal link structure looks like, and which pages currently attract the highest traffic. Using SEO audit tools of this type, you should be able to filter your URLs by substring and ensure that every page is sufficiently detailed and includes the right anchor text.

A simple 3-step framework for improving your technical SEO (January 3, 2019)

In a world where everyone’s fighting for relevance in search, technical SEO is a Swiss army knife you can use to improve your site’s usability, crawlability, indexation and ultimately rankings.

But it’s easy to get lost in the weeds while working through technical SEO fixes — many SEOs reach a point of diminishing returns where they keep making small changes that achieve frustratingly little. That’s why it’s important to understand what to prioritize as well as how to accomplish your goals.

This article explores three key pillars you can focus on to strengthen your technical framework. However, even though this article only explores technical SEO best practices, please remember that you cannot neglect your on-page SEO (such as content creation and optimization) and off-page SEO (such as link building) if you want your website to rank well and compete for high-priority keywords.

1. Indexing and crawlability

Google needs to index your website’s pages before they can appear in search. You can help the search engine out by making sure it can find your important pages (i.e., that they’re crawlable) and that it indexes them properly. This is SEO 101, but it’s a vital first step.

Ensure all essential pages are indexed

You can check the indexation status of your website by entering site:domain.com into your target search engine, using an SEO crawling tool, or logging into Google Search Console and then clicking on Google Index > Coverage.

If the number of indexed URLs doesn’t match the total number of URLs in your database that are open for indexation, this may be indicative of duplicate URLs and URLs that contain a noindex meta tag. You’ll need to identify the error and follow Google’s recommended fix steps.

Ensure all important resources are crawlable

Robots.txt will give you an at-a-glance idea of whether or not your most important pages are crawlable, but you might be facing a variety of other problems that you need to watch out for:

  • Orphan pages (on-site pages that aren’t linked to internally)
  • noindex meta tag
  • X-Robots-Tag headers

Optimize your crawl budget

The number of pages a search engine crawls on your website in a given period is called your “crawl budget.” It’s not a ranking factor but gauging how often Google crawls and indexes your pages might help you identify some technical sinkholes (and maybe even find pages that aren’t being crawled at all). Click on Crawl > Crawl Stats in your Google Search Console to see your daily crawl budget.

My team performed a crawl budget optimization analysis last year, which determined that some of the best ways to augment your crawl budget organically include:

  • Eliminating duplicate content and pages.
  • Restricting indexation of pages such as terms and conditions, privacy policies, and outdated promotions (in other words, pages with no SEO value).
  • Fixing broken links and redirect chains.

Another great way to improve crawl budget is to grow your link profile, but that will take time and investment in your off-page SEO campaigns.

Employ structured data

Schema markup improves your CTR by providing users with a clear snapshot of what your company does (via rich snippets), and it helps search engines gain a contextual understanding of your content. If you don’t have structured data set up for your pages, go to schema.org to learn how and review your snippets using Google’s Structured Data Testing Tool.
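For example, a minimal JSON-LD block for a hypothetical organization might look like this (the name, URL and logo are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>

Drop it into the page’s <head> or <body> and run the result through the Structured Data Testing Tool to confirm it parses.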

Don’t forget about mobile-first indexing

I want to avoid retreading too much familiar ground in this article, and Barry Adams published a great mobile SERP survival guide a few months back, so I’ll add to his comprehensive overview of mobile-first indexing by adding:

  • Factor voice search into your keyword research (Google’s so-called “micro-moments” [.pdf])
  • Weigh the pros and cons of AMP pages while creating your content
  • Consider whether most of your mobile users are local and whether or not you need to flesh out your local SEO campaigns as well

2. Site structure and navigation

Creating sites that are intuitive and easy to navigate helps both bots and users explore your site and understand its content. A flat site architecture, clear pagination, and a clean sitemap are just a few of the fixes you can make to improve UX and crawlability of your site.

Review your sitemap

Sitemaps help search engines find your site, tell search engines how your site is structured, and make it easy for them to discover fresh content. If you don’t have a sitemap, it’s high time you built one and uploaded it to Google Search Console and Bing Webmaster Tools.

Keep your sitemap up-to-date, concise (it must be under 50,000 URLs but should be shorter if possible), and free from errors, redirects and blocked URLs. Also, make sure your sitemap’s code is valid by running it through the W3C validator.

Audit internal linking structure

You want to keep your click-depth as shallow as possible and anchor each internal link with text that clearly indicates where it will send users. The clearer your navigation, the better search engines will understand your website’s context. Also, don’t forget to weed out broken links and orphan pages.

Establish a logical hierarchy

Generally speaking, the more clicks it takes to access a particular piece of content from your homepage, the more in-depth that content should be. Ideally, every important page should be reachable within three clicks from the homepage (as long as they’re laid out logically and mapped to your ideal user’s buyer’s journey).

Check your hreflang tags

If your website uses hreflang tags to localize content for different locations, you’d better make sure they’re error free. Last year, SEMrush discovered that 75 percent of all websites have at least one error in their hreflang implementation, resulting in misdirects, incorrect content and lost rankings.

Make sure you regularly monitor and troubleshoot your implemented hreflangs, choose the best implementation method for what you’re trying to achieve, generate hreflang code for each page and update your hreflang tags for the mobile version of your website (if necessary).

3. Site speed

Cards on the table: “site speed” is a bit of a misnomer because there’s no magic button to make your site “go faster.” What you’re actually doing is making small technical improvements that improve user-centric metrics such as time to first content.

What makes these technical fixes so important is that you’re ultimately improving both your page speed and your Optimization Score — and while FCP/DCL metrics don’t currently impact ranking in any significant way, a Page Speed study we conducted in July proves that Optimization Score does.

Plus, faster sites have lower bounce rates and higher conversion rates. There’s really no downside to optimizing your website and delivering a faster user experience.

Here’s the short and sweet version of the nine advanced tips I covered in-depth in September:

Limit redirects

Each page should have no more than one redirect. When redirects must be used, use 301 for permanent redirects and 302 for temporary redirects.

Enable compression

Eliminate unnecessary data whenever possible. When it’s not possible, use a tool like Gzip or Brotli to compress content and reduce file size. Remember to use different techniques for different resources.
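As a rough sketch, enabling Gzip on an nginx server can be as simple as the following (the compression level and MIME types are illustrative; tune them to your stack):

gzip on;
gzip_comp_level 5;
gzip_min_length 256;
gzip_types text/css application/javascript application/json image/svg+xml;

(text/html is compressed by default once gzip is on; Apache users can do the same with mod_deflate.)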

Reduce server response time to less than 200ms

Using HTTP/2 can give your site a performance boost, and enabling OCSP stapling can speed up your TLS handshakes. You can also improve site speed by leveraging resource hints and by supporting both IPv6 and IPv4.

Set up a caching policy

Use browser caching to control how, and for how long, a browser can cache a response (according to Google’s optimal cache-control policy). Also, use ETags to enable efficient revalidations.
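For instance, a long-lived static asset such as a versioned CSS file might be served with response headers like these (the values are illustrative and should reflect how often the file actually changes):

Cache-Control: public, max-age=31536000, immutable
ETag: "a1b2c3d4"

Frequently changing HTML, by contrast, is often served with Cache-Control: no-cache so the browser revalidates it (via the ETag) before reuse.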

Minify resources

Use minification to strip unnecessary code from all of your assets, including CSS, HTML, JavaScript, images and videos.

Optimize your images

Images account for 60 percent of the average web page’s size. A few of the simpler tips: pick the best raster formats for your images, eliminate unnecessary image resources, and try to make sure that all images are compressed, resized, and scaled to fit display sizes.

Optimize CSS delivery

Inline small CSS files directly into the HTML document (just don’t inline large CSS files or CSS attributes on HTML elements).

Stay within the above-the-fold congestion window

Prioritize visible content by organizing HTML markup to quickly render above-the-fold content. The size of that content shouldn’t exceed the initial TCP congestion window (roughly 14.6kB compressed). This is especially important for mobile users.

Remove render-blocking JavaScript above the fold

Inline critical scripts and defer non-critical scripts and 3rd party JavaScript libraries until after the fold to decrease rendering time. If you do have JavaScript above the fold, mark your <script> tag as async to ensure that it’s non-render blocking.
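A quick sketch of both attributes (the script URLs are placeholders):

<script async src="https://www.example.com/js/analytics.js"></script>
<script defer src="https://www.example.com/js/widgets.js"></script>

async downloads the script in parallel and runs it as soon as it arrives, while defer waits until the document has been parsed; either way, the script no longer blocks rendering.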

Conclusion

Now that you have a springboard to help you jump into the deep-end of technical SEO improvements, it’s time to start thinking about how you can use SEO to optimize your content and link profiles, as well as improve server-side latency. From migrating your website to an HTTPS domain to robust keyword research to optimizing H1 tags, there’s no end to the improvements you can make.

How to use Chrome User Experience Report to improve your site’s performance (November 7, 2018)

An update to PageSpeed Insights from “lab data” to “field data” has the potential to significantly influence how Google handles your search engine rankings.


At the 2017 Chrome Web Summit Conference, Google introduced the world to the Chrome User Experience Report (CrUX) – a database constructed from multiple samples pulled from real Chrome users’ web browsing experiences. According to Google, the goal was to “capture the full range of external factors that shape and contribute to the final user experience.”

A few short months later, Google updated their PageSpeed Insights tool to score two separate categories: “speed” and “optimization.” The familiar PageSpeed Insights grade based on technical issues and Google’s recommended fixes was moved to the Optimization section, while the newly introduced Page Speed section started labeling webpages as “fast,” “average” or “slow” based on the median values of a page’s First Contentful Paint (FCP) and DOM Content Loaded (DCL).

More importantly, instead of gathering these two performance metrics in a controlled lab setting (the way most webmasters do, by loading the same page in their browsers several times to get an average timing), Google aggregates them based on Real User Monitoring (RUM).

Your Page Speed score is based on data collected from what millions of Chrome users do: how they interact with your pages, how long it takes content to load, what devices they’re using, etc. An ‘unavailable’ speed score means that CrUX doesn’t have enough information about your traffic and data. Keep in mind though that the CrUX database keeps growing constantly, so it’s worth checking back in a while even if your Page Speed score is currently unavailable.

While an update to PageSpeed Insights might not seem particularly earth-shattering (especially if you preferentially use great tools like Pingdom, WebPageTest or even Chrome Dev Tools), it is important to note this shift from “lab data” to “field data.” What this means for SEOs is that how Google evaluates your website might not match your local speed tests — and this shift has the potential to significantly influence your search engine rankings.

In this article, we’ll show you how some local speed tests are showing drastically different results from CrUX performance metrics. We’ll also teach you how to use CrUX’s first-hand data to speed up your site.

The impact of CrUX data on site speed measurements

To see how CrUX data can be different from the data in the “lab” tools, let’s perform local speed tests for two major news publications: USAToday.com and CNN.com.

First, let’s run both home pages (usatoday.com and edition.cnn.com) through WebPageTest with the default settings enabled.

You’ll notice that, according to these measurements, the load time for USA Today is several times longer than CNN’s (118.711s vs. 16.751s, respectively).

Let’s compare this “lab test” data against the CrUX data as calculated by PageSpeed Insights.

You might be surprised to learn that, as far as Google is concerned, USA Today’s home page is considered to be “fast,” while that of CNN is assessed as “slow.”

Why the discrepancy?

All performance data included in CrUX is pulled from real-world conditions, aggregated from millions of actual Chrome users browsing your website (provided, of course, that those users all opt-in to syncing their browsing history and have usage statistic reporting enabled).

How quickly users see the first visual response from your webpage (FCP), and the time it takes for an HTML document to be loaded and parsed by real-world visitors of your site (DCL) both contribute to how fast Google considers your site to be.

In other words, if most of your users have a slow internet connection or use outdated devices, Google may see your website as “slow” — even if you’ve optimized the heck out of it. The flip side is a website that isn’t completely optimized might be considered “fast” if most of your users are local, have access to better devices or if they have faster connections.

This USA Today/CNN example shows us that performance is no longer a “stable” value you can calculate within your dev environment — it now depends heavily on your users. The most accurate way to explore your site’s performance is to rely on RUM data.

And there’s nowhere better to find reliable real-user performance metrics than the database Google is already using: CrUX.

How to access your CrUX data

If it’s been a while since you’ve used PageSpeed Insights and you’re curious how fast Google considers your site to be, then taking a look at your speed score is a good place to start.

Next, it’s time to access your CrUX data. The good news is that all the data is publicly available on Google BigQuery, which is part of the Google Cloud Platform. This functionality is free unless you use it heavily. All you have to do is:

1. Log in to Google Cloud
2. Create a new CrUX project

3. Navigate to the BigQuery console and click ‘Compose Query’

If you want a hands-on look at all the metrics available to you in the project, you can take a look at the table details of the “chrome-ux-report” dataset.

If you examine the way the tables are structured, you’ll notice that CrUX has datasets for each country in addition to an “all” dataset. When you expand those datasets, you’ll see a list of monthly tables named by year and month (e.g., 201807, 201808).

Now, you can run queries and select the data you need with a pretty basic knowledge of SQL.

We’ll start with a basic query that will tell us how many unique origins are available in the dataset as of August 2018 (remember, “origins” are different than “domains”—the HTTP and HTTPS versions of the same domain will have different origins).

To do this, paste the query below into the query editor and click Run Query to execute it:

SELECT count(DISTINCT(origin))
FROM `chrome-ux-report.all.201808`

As you can see, as of August 2018, Google has data on about 4.4 million different origins.

Now, if you want to calculate the number of unique domains in Google’s database, we’d use the Standard SQL function NET.HOST instead. This will turn a URL into host:

SELECT count(DISTINCT(NET.HOST(origin)))
FROM `chrome-ux-report.all.201808`

It turns out that the number of unique domains is a bit smaller: 4.2 million.

Now we can make more specific queries to solve our challenges. For example, if we wanted to use CrUX data to see how fast your website is for your real-world users, we would execute the following query:

SELECT
    form_factor.name AS device,
    fcp.start,
    ROUND(SUM(fcp.density), 4) AS density
FROM
    `chrome-ux-report.all.201807`,
    UNNEST(first_contentful_paint.histogram.bin) AS fcp
WHERE
    origin = 'http://example.com'
GROUP BY
    device,
    start
ORDER BY
    device,
    start

If we were to use this query to examine the distribution from our CNN example above (edition.cnn.com), here’s what we’d see the First Contentful Paint (FCP) metrics are for their users:

Row     device start     density
1          desktop           0          0.10%
2          desktop           200      1.01%
3          desktop           400      2.47%
4          desktop           600      3.04%
…         …         …         …

This means that, on desktop devices, 0.10% of visits see CNN’s first content render within 200 milliseconds of navigation, another 1.01% between 200 and 400 ms, and so on. You can get the same results for other devices (phone or tablet), across different countries, etc.

You can even plug this data into your favorite visualization program (Tableau, Google’s Data Studio or even Excel) to get a visual representation of your website performance.

And we can break down those visualizations by country as well.

Just from this sample, we can find a lot of interesting insights. For example, we quickly and effortlessly learned that US visitors lean toward browsing CNN’s site from desktop devices (53.77%), whereas the rest of the world prefers mobile (40.14% of worldwide visits).

Even better — we can use CrUX Dashboard templates right inside Data Studio to identify trends and regressions before they negatively affect your site engagement and bottom-line metrics.

Simply go to g.co/chromeuxdash to see how the user experience of an origin changes over time. Set up is relatively straightforward and once you enter an origin, you’ll be able to generate a visualization of data pulled directly from the Chrome UX Report’s community connector.

There are three types of Chrome UX reports currently available in the Data Studio dashboard.

As you can see, if you know how to use it, then CrUX is a powerful tool for analyzing your site’s speed in a variety of different ways. You can also consider using other types of queries to:

  • Compare your site’s performance against competitor sites.
  • Analyze site performance across different devices and connection types.
  • Measure performance across multiple different countries.
  • Dig into granular speed metrics like First Paint (FP), First Contentful Paint (FCP), DOM Content Loaded (DCL), onload, and even experimental metrics like First Input Delay.

If you need more help getting started with CrUX, I recommend reviewing Google’s Getting started guide. This introduction will help you learn how to navigate the database so that you can use first-hand insights from your real-world users to speed up your pages.

Conclusion

Unlike PageSpeed Insights, CrUX doesn’t give you a neat checklist of technical issues you can address. You’re off the edge of the map here, but it would be foolish to ignore this valuable data. After all, Real User Measurements have been the gold standard for measuring the performance of web applications for years – and now that they play a role in search rank, they matter more than ever. There’s no better way to gauge exactly how users experience your website.

Of course, you don’t have full control over the devices your visitors are using or their connection speeds, but that doesn’t mean you can’t glean some valuable insights from this data. If you’re a savvy SEO, then you’ll be able to use CrUX data as a benchmark that you can measure against your ongoing optimization efforts. This data will also help you find opportunities for growth, improve the experience for your users and grow your rankings.

Double down on speed optimization with these 9 advanced tips (September 10, 2018)

Does your site have a need, a need for speed? Here’s a breakdown of Google’s PageSpeed Insights rules and best-practice advice on optimizing web pages for greater speed performance.

If you’ve plugged your URL into Google’s PageSpeed Insights within the last month, you’ll have noticed that it looks a little different. Where you used to receive a simple optimization score, your scores are now divided by platform and split into two scores, “Page Speed” and “Optimization.”

The changes were made as a result of the new Speed Update launched July 9, 2018. Now, instead of relying on lab data, Google uses field data to measure site speed. By extracting information from the Chrome User Experience Report (CrUX) database, Google is able to discern how fast your average user finds your site.

That means that even if your website is lightning-fast on your end, visitors with older smartphones might experience delays — which could impact your speed score, and possibly your website’s ranking. If you haven’t already, it’s time to double down on speed optimization.

I am going to break down Google’s nine PageSpeed Insights rules, list the best-practice advice for each, and then dive into some advanced steps you can take to optimize your site speed even more.

1. Avoid landing page redirects

Why it matters. Redirects delay page rendering and slow down your mobile site experience. Each redirect adds an extra Hypertext Transfer Protocol (HTTP)  request-response roundtrip and sometimes adds numerous additional roundtrips to also perform the domain name system (DNS) lookup, Transmission Control Protocol (TCP) handshake and transport layer security (TLS) negotiation.

What Google recommends. Create a responsive website with no more than one redirect from a given URL to the final landing page.

Advanced recommendations. Try to avoid redirects altogether. However, if you need to use redirects, choose the type of redirect based on your need:

  • 301 versus 302 redirects. Use permanent redirects (301) when you delete old content and redirect to new content, or when you don’t have an alternate page to redirect users to. Use temporary redirects (302) when making short-term changes, such as limited time offers, or when redirecting users to device-specific URLs. Don’t worry; you won’t lose link equity either way!
  • JavaScript vs. HTTP redirects. The main difference between JavaScript and HTTP redirects is that HTTP redirects cause some latency on the server-side, while JavaScript-based redirects slow down the client-side (they need to download the page, then parse and execute the JavaScript before triggering the redirect). Googlebot supports both types of redirects.

2. Enable compression

Why it matters. Reducing the size of your content shortens the time it takes to download the resource, reduces data usage for the client and improves your pages’ time to render.

What Google recommends. Gzip all compressible content. You can find sample configuration files for most servers through the HTML5 Boilerplate project.

Advanced recommendations

  • Prioritize removing unnecessary data. Compression is great, but the best-optimized resource is a resource not sent. Review your site resources periodically and eliminate unnecessary data before compression to guarantee the best results.
  • Consider alternatives to Gzip encoding. If you want to use a tool other than Gzip, Brotli is a lossless compression algorithm that combines a modern variant of the LZ77 algorithm, Huffman coding and second order context modeling. It’s supported by all modern browsers and has a compression ratio comparable to the best general-purpose compression methods currently available. Brotli compresses very slowly and decompresses fast, so you should pre-compress static assets with Brotli+Gzip at the highest level and compress dynamic HTML with Brotli at level 1–4.
  • Use different compression techniques for different resources. Compression can be applied to HTML code, as well as various digital assets that your page requires, but you’ll need to apply different techniques and algorithms to your web fonts, images, CSS and so on to achieve the best result. For example, if you’re using HTTP/2, then using HPACK compression for HTTP response headers will reduce unnecessary overhead.

3. Improve server response time

Why it matters. Fast server response times are a necessity; 53 percent of mobile visitors will abandon a page that doesn’t load within three seconds.

High-quality website development is essential if you want to avoid central processing unit (CPU) starvation, slow application logic, slow database queries, slow routing, slow frameworks and slow libraries.

What Google recommends. Server response time should always be below 200ms.

Advanced recommendations.

  • Measure server response time and Real User Measurements (RUMs). Use a tool like WebPageTest.org, Pingdom, GTmetrix or Chrome Dev Tools to pinpoint existing performance issues and figure out what’s slowing down your content delivery process. Remember, even if your tests show a site speed <200ms, a user on an older-generation Android using slow 3G might experience 400ms RTT and 400kbps transfer speed. This will have a negative impact on your Site Speed score. To improve this user’s experience, you’d have to aim for:
    • A first meaningful paint < 1s.
    • A SpeedIndex value < 1250.
    • A time to interactive (TTI) < 5s, and < 2s for repeat visits.
  • Optimize for user experience. While configuring your server:
    • Use HTTP/2 for a performance boost (and make sure your CDN supports it too).
    • Enable online certificate status protocol (OCSP) stapling on your server to speed up TLS handshakes.
    • Support both IPv6 and IPv4. IPv6’s neighbor discovery (NDP) and route optimization can make websites 10–15 percent faster.
    • Add resource hints (dns-prefetch, preconnect, prefetch and preload) to warm up connections and speed up content delivery.
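
As a rough sketch of the real-user measurement mentioned in the first bullet above, the browser snippet below reads the time to first byte and the paint metrics straight from the Performance API; in practice you would send these values to your analytics endpoint (for example with navigator.sendBeacon) rather than log them.

```javascript
// Runs in the browser, e.g., from a small inline script or your analytics bundle.
window.addEventListener('load', () => {
  const [nav] = performance.getEntriesByType('navigation');
  if (nav) {
    // Approximate server response time (time to first byte) as seen by this user.
    console.log('TTFB:', Math.round(nav.responseStart - nav.requestStart), 'ms');
  }

  // Paint metrics reported by the browser, including first contentful paint.
  performance.getEntriesByType('paint').forEach((entry) => {
    console.log(entry.name, Math.round(entry.startTime), 'ms');
  });
});
```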

4. Leverage browser caching

Why it matters. When fetching resources over the network, every additional round trip between the client and the server adds delay and data costs for your visitors. You can mitigate this slow and expensive process by implementing a caching policy that helps the client figure out if and when it can reuse responses it has already received.

What Google recommends. Explicit caching policies that answer:

  1. Whether a resource can be cached.
  2. Who can cache it.
  3. How long it will be cached.
  4. How it can be efficiently revalidated (if applicable) when the caching policy expires.

Google recommends a minimum cache time of one week and up to one year for static assets.

Advanced recommendations.

  • Use Cache-Control to eliminate network latency and avoid data charges. Cache-Control directives let you control how (e.g., “no-cache” and “no-store”) and for how long (e.g., “max-age,” “max-stale” and “min-fresh”) the browser can cache a response without needing to communicate with the server.
  • Use ETags to enable efficient revalidation. Entity tag (ETag) HTTP headers communicate a validation token that prevents data from being transferred if a resource hasn’t changed since the last time it was requested. This improves the efficiency of resource update checks.
  • Consult Google’s recommendations for optimal Cache-Control policy. Google has created a checklist and a flowchart to help you cache as many responses as possible for the longest possible period and provide validation tokens for each response.

The rule of thumb is that mutable (i.e., likely to change) resources should be cached for a very short time, whereas immutable (i.e., static) resources should be cached indefinitely to avoid revalidation.
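
As a minimal sketch of those headers in practice, the Node.js snippet below serves a hypothetical fingerprinted static asset with a one-year, immutable Cache-Control policy and an ETag-based 304 revalidation path; in a real setup you would usually configure this in your web server or CDN rather than in application code.

```javascript
const http = require('http');
const crypto = require('crypto');
const fs = require('fs');

http.createServer((req, res) => {
  // Hypothetical fingerprinted asset; a new filename is generated whenever the content changes.
  const body = fs.readFileSync('./public/logo.3f2a1c.png');
  const etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

  // Immutable, fingerprinted assets can safely be cached for up to a year.
  res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
  res.setHeader('ETag', etag);

  // Efficient revalidation: if the client already holds this exact version, skip the transfer.
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304);
    return res.end();
  }

  res.writeHead(200, { 'Content-Type': 'image/png' });
  res.end(body);
}).listen(8080);
```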

5. Minify HTML, CSS and JavaScript

Why it matters. Minification eliminates redundant data from the resources delivered to your visitors, and it can have a drastic impact on your overall site speed and performance.

What Google recommends. No redundant data in your web assets (e.g., comments or extra whitespace in HTML code, repeated styles in CSS or unnecessary image metadata).

Advanced recommendations.

  • Use minification in tandem with compression. At first blush, minification sounds like compression, but it’s a lot more granular. Compression algorithms are great for reducing the size of a page, but most of them don’t know how to strip comments from CSS (/* … */), HTML (<!-- … -->) and JavaScript (// …), collapse CSS rules or perform dozens of other content-specific optimizations.
  • Apply minification to other resource types too. You can minify more than just text-based assets like HTML, CSS and JavaScript. Images, video and other content types can also be minified, depending on your needs. For example, images carry their own metadata and other payloads, which you might want to keep if you’re publishing them on a photo-sharing site.
  • Automate minification. Use tools to ease the burden of minifying thousands (if not millions) of different resources on your website. Google’s PageSpeed Module does this automatically, and it can be integrated with Apache or Nginx web servers. Alternatively, you can use third-party tools such as HTMLMinifier (for HTML), CSSNano or CSSO (for CSS) and UglifyJS (for JavaScript).
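
As a small sketch of that automation, the Node.js snippet below runs the HTMLMinifier package (html-minifier on npm) over a fragment of markup; the markup and the exact options shown are illustrative.

```javascript
// npm install html-minifier
const { minify } = require('html-minifier');

const html = `
  <!-- promo banner -->
  <div class="banner">
      <p>   Limited   time   offer!   </p>
  </div>
`;

const minified = minify(html, {
  removeComments: true,     // strip <!-- ... --> comments
  collapseWhitespace: true, // collapse runs of spaces and newlines
  minifyCSS: true,          // also minify inline <style> blocks
  minifyJS: true,           // and inline <script> blocks
});

console.log(minified); // roughly: <div class="banner"><p>Limited time offer!</p></div>
```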

6. Optimize images

Why it matters. Images account for roughly 60 percent of the average web page’s size, and large images can slow your site to a crawl. Optimizing images reduces file size without significantly impacting visual quality.

What Google recommends. Make sure your website and images are responsive. Use relative sizes for images, use the picture element when you want to specify different images depending on device characteristics, and use a srcset attribute and the x descriptor in the img element to inform browsers when to use specific images.

Advanced recommendations. Follow this checklist of the most common optimization techniques:

  • Eliminate unnecessary image resources.
  • Leverage CSS3 to replace images.
  • Use web fonts instead of encoding text in images.
  • Use vector formats where possible.
  • Minify and compress scalable vector graphics (SVG) assets to reduce their size.
  • Pick the best raster format (start by selecting the right universal format: GIF, PNG or JPEG, then consider adding WebP and JPEG XR (extended range) assets for modern clients).
  • Experiment with optimal quality settings. Remember that there is no single best format or “quality setting” for all images: each combination of particular compressor and image contents produces a unique output.
  • Resize on the server and serve images scaled to their display size.
  • Remove metadata.
  • Enhance img tags with a srcset attribute for high dots per inch (DPI) devices.
  • Use the picture element to specify different images depending on device characteristics, like device size, device resolution, orientation and more.
  • Use image spriting techniques carefully. With HTTP/2, it may be best to load individual images.
  • Consider lazy loading for non-critical images (see the sketch after this checklist).
  • Cache your image assets.
  • Automate your image optimization process.
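
As a sketch of that lazy-loading item, the snippet below starts downloading an image only when it approaches the viewport. It assumes hypothetical img.lazy elements whose real source is stored in a data-src attribute.

```javascript
// Markup assumption: <img class="lazy" data-src="/images/gallery-1.jpg" alt="...">
const lazyImages = document.querySelectorAll('img.lazy');

const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src; // trigger the download only when the image nears the viewport
    img.classList.remove('lazy');
    obs.unobserve(img);
  });
}, { rootMargin: '200px' }); // start loading a little before the image scrolls into view

lazyImages.forEach((img) => observer.observe(img));
```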

When it comes to image optimization, there’s no single “best” way to do it. Many techniques can reduce the size of an image, but finding the optimal settings for your images will require careful consideration of format capabilities, the content of encoded data, quality, pixel dimensions and more. For more tips, visit Google’s guide to Optimizing Images.

7. Optimize CSS delivery

Why it matters. Browsers typically follow these five steps when rendering a page:

  1. Process HTML markup and build the document object model (DOM) tree.
  2. Process CSS markup and build the CSS object model (CSSOM) tree.
  3. Combine the DOM and CSSOM into a render tree.
  4. Run layout on the render tree to compute the geometry of each node.
  5. Paint the individual nodes to the screen.

In other words, the browser has to process CSS before it can render a page. When that CSS is bloated or spread across render-blocking external stylesheets, the process often requires multiple round trips, which delays the time to first render.

What Google recommends. Inline small CSS directly into the HTML document to eliminate small external CSS resources.
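
As a minimal build-step sketch of that recommendation (the file names are hypothetical), the Node.js snippet below swaps a reference to a small stylesheet for an inline style block, saving one render-blocking request.

```javascript
const fs = require('fs');

// Hypothetical small stylesheet and page produced by your build.
const css = fs.readFileSync('dist/small.css', 'utf8').trim();
let html = fs.readFileSync('dist/index.html', 'utf8');

// Replace the external reference with the same rules inlined in the document.
html = html.replace(
  '<link rel="stylesheet" href="/small.css">',
  `<style>${css}</style>`
);

fs.writeFileSync('dist/index.html', html);
```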

Advanced recommendations

  • Avoid inlining large CSS files. While inlining small CSS can speed up the time it takes for a browser to render the page, inlining large CSS files will increase the size of your above-the-fold CSS and will actually slow down render time.
  • Avoid inlining CSS attributes. Similarly, inlining CSS attributes on HTML elements often results in unnecessary code duplication, and it’s blocked by default with a Content Security Policy.

8. Prioritize visible content

Why it matters. If your above-the-fold content exceeds the initial congestion window (typically 14.6kB compressed), the browser needs multiple round trips to load and render it. That can mean high latency and significant delays to page loading, especially for mobile users.

What Google recommends. Reducing the size of above-the-fold content to no more than 14kB (compressed).

Advanced recommendations

  • Limit the size of the data required to render above-the-fold content. If you’ve been following along, you should already be using resource minification, image optimization, compression and all the other tips and tricks to reduce the size of your above-the-fold content.
  • Organize your HTML markup to render above-the-fold content immediately. Changing the structure of your HTML markup can greatly speed up how quickly your above-the-fold content loads and renders, but what you change will vary from page to page. For example, you may need to split your CSS into two parts: an inline part responsible for styling the above-the-fold portion of the content and a deferred stylesheet for the rest (see the sketch after this list). Or you may need to change the order of what loads on your page first (e.g., main content before widgets).
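
As a sketch of that CSS split, assume the critical above-the-fold rules are already inlined in the head at build time (see tip #7); the snippet below then attaches a hypothetical below-the-fold stylesheet only after the first render, so it never blocks it.

```javascript
// Attach the non-critical stylesheet once the page has rendered.
function loadDeferredStyles() {
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/below-the-fold.css'; // hypothetical non-critical stylesheet
  document.head.appendChild(link);
}

if (document.readyState === 'complete') {
  loadDeferredStyles();
} else {
  window.addEventListener('load', loadDeferredStyles);
}
```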

9. Remove render-blocking JavaScript

Why it matters. You may recall from tip #7 that a page needs to build its DOM by parsing the HTML before a browser can render it. Every time the parser encounters JavaScript, it has to stop and execute that script before it can continue building the DOM tree. The delay is even more pronounced for external scripts, where the browser must also wait for the download, and it can add substantial time to the rendering process.

What Google recommends. Remove all blocking JavaScript, especially external scripts, in above-the-fold content.

Advanced recommendations

  • Make JavaScript non-render blocking. Marking your script tag as async will tell the browser not to block DOM construction while it waits for the script to be loaded and executed. However, you should only do this if you know that you don’t need to change anything within the DOM tree while it’s being parsed/constructed.
  • Inline critical scripts and defer non-critical scripts. Scripts that are necessary for rendering page content should be inlined to avoid extra network requests. These should be as small as possible in order to execute quickly and deliver good performance. Non-critical scripts should be made asynchronous and deferred until after the first render. Just remember that asynchronous scripts are not guaranteed to execute in a specified order.
  • Defer third-party JavaScript libraries until after the fold. JavaScript libraries that enhance interactivity or add animations and other effects (e.g., jQuery) usually don’t need to render above the fold. Whenever possible, load these scripts asynchronously and defer them until after the first render, as in the sketch below.
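
As a sketch of that last point, the snippet below injects jQuery only after the page has finished loading (the CDN URL and version are illustrative), so the parser is never blocked by the external script.

```javascript
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = 'https://code.jquery.com/jquery-3.4.1.min.js'; // illustrative CDN URL
  script.async = true; // dynamically injected scripts are already non-blocking; set explicitly for clarity
  script.onload = () => {
    // Initialize widgets, animations and other below-the-fold enhancements here.
  };
  document.body.appendChild(script);
});
```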

Conclusion: Testing the results of the Speed Update

To find out what impact the Speed Update actually has on SERP positions, the SEO PowerSuite (my company) team and I conducted two experiments — one before and one immediately after Google rolled out their update.

We discovered that, even before the update, the correlation between a mobile site’s position in the SERPs and its average optimization score was already extremely high (0.97), while a site’s first contentful paint (FCP) and DOM content loaded (DCL) metrics (now displayed in PageSpeed Insights beneath the Page Speed score) had little to no bearing on its position.

We didn’t notice any significant changes one week after the update, which is understandable: it takes time for an update like this to take full effect. The correlation between optimization score and position in mobile SERPs remained high, while the correlation between FCP/DCL and position remained low.

Within the past three months, the optimization scores of sites ranking in the top 30 positions of mobile SERPs have increased by an average of 0.83 points, which we read as an industry-wide rise in the quality of websites.

What this tells us is that the standards for what constitutes a fast, optimized site are increasing — and you can’t afford to become complacent. Improving speed, like SEO as a whole, is a process, and if you don’t keep tweaking and improving, you risk being left behind.

The post Double down on speed optimization with these 9 advanced tips appeared first on Search Engine Land.

]]>