Alexis Sanders – Search Engine Land: News on Search Engines, Search Engine Optimization (SEO) & Search Engine Marketing (SEM)

2019 in search: Find your seamlessness (January 2, 2019)

Companies should focus on working seamlessly across all channels, consolidating recurring tasks, leveraging data and instilling credibility in all aspects.

The post 2019 in search: Find your seamlessness appeared first on Search Engine Land.

Well, Google turned twenty this year, and if the political and media drama was any indication of what 2019 holds, we have a lot to look forward to in search. (And if Pichai’s hearing was any indication of the general searcher’s knowledge, then we’ll also have job security.)

Wait, remind me of what happened this year: a 2018 recap

Skipping over a bunch of (probably important) details, Google received some real flak from the media and government relating to a variety of consequential, societal issues. These topics all relate to how the American consumer digests information and Google’s role and responsibilities. (Think: “With great power comes great responsibility.” Sidebar: pouring some out for Stan Lee.)

Top issues include:

  • Confirmation bias: We are what we search (and the specific way we ask it). Unfortunately, search being a pull channel, we have an innate issue of confirmation bias. This is exacerbated by the general public’s limited critical evaluation of the information they digest (leading to a garbage-in, garbage-out problem). Ultimately, responsibility has shifted from individuals validating sources to search engines being expected to validate information (no pressure…). The question (of course) remains — could search engines ever be responsible for their algorithms’ results?

Source: chainsawsuit.com

  • Accuracy: The false assumption that search engines are infallible means that the accuracy of results, particularly fact-based information, is more important than ever. Since many users are comfortable living in their own bias bubbles, blissfully ignorant and too lazy to discern fact from fiction, we push that responsibility to search engines. Shouldn’t it be their job to tell us what is and is not “fake news,” or to help us more accurately self-diagnose medical conditions? This shift has major implications for the big G (and we’ll probably see more in 2019).
  • Accusations of political bias: During the Pichai congressional committee hearing, there were numerous allegations of political bias. Regardless of the validity of these claims, Google reacted in some form.
  • Privacy: Concerns ranged from Europe’s GDPR, to DuckDuckGo mascot Dax accusing Google of placing users in a filter bubble, to questions over whether Google holds too much data (particularly location data). The problem is that most users don’t understand what they’re giving away when they agree to privacy terms and conditions and enable location-based settings (and most of the time, they don’t care). Opening our doors to an oligopoly of tech brands has long-term societal implications (I’ll leave these predictions to most any sci-fi novel).
    • Plus: Privacy concerns tend to flare up when something affecting trust occurs (e.g., potential data breaches – maybe a -1).
    • Sidebar: Here is a great article to see what data is being stored and (if you choose) methods for deleting it.

Whelp, here’s a map of everywhere I’ve been in the last two months. (Find yours too, there’s even a zoom feature!)

  • International affairs, including:
    • Whether or not foreign interference occurred during certain elections is something only the tech giants (and probably Vladimir Putin) know. It’s a “who do you believe” situation. Did or did not Google disclose the appropriate information (a real-life Hamilton situation here)?
    • Google’s willingness to work within the Chinese government’s censorship requirements (via the now supposedly ended Dragonfly project).

This past year was a rough one politically, leading Pichai to state, “You know, we deliberate about things a lot more, and we are more thoughtful about what we do.” I’m sure the ongoing focus on deliberateness will carry downstream. Exactly how this will affect the “move fast, break things” Silicon Valley philosophy of our technological overlords is TBD.

A slight modification suggested to Facebook’s tagline.

Where is Google going?

Well, if we follow the dollar bills, Alphabet makes money via ads. They even say so themselves in their financial statements: “We generate revenues primarily by delivering relevant, cost-effective online advertising” (and ads generated roughly 86 percent of all (yes, that’s *all*) of Alphabet’s Q3 revenue). This is simply to say that search is important to Google, as there is a substantial economic incentive to keep search users delighted. Understanding the mechanics of search, knowing how to add value and knowing where to prioritize SEO efforts remain vital.

What does this mean for Google?

1. They’re probably going to focus on the known, nestling themselves into a branded comfort zone. This includes:

  • Doubling down on the known:
    • More focus on top brands, news outlets, and well-known sites.
    • Potentially, more visual cues in SERPs to indicate how credible Google considers results to be.
  • Tripling down on the facts:
    • Semantic fact checking, flagging of pages exceeding a certain threshold.
    • There will probably be a side focus on medical answers. This means working with hospitals and seeking out more expert advice. Also, high chances of trying to crack the PII nut that is health by putting data in the hands of consumers.
      1. Maybe this includes (probably not within the next year) integrating with telehealth services.
      2. Ultimately, if consumers have control of their health data, Google can provide that data to technology that uses machine learning to analyze data points, return the most relevant results and suggest solutions. We won’t get there in 2019, but the seeds will probably be planted.
  • Finally, it will release some form of multi-perspective answers to address a sliver of confirmation bias for queries like: [Is [x] good?] or [Is [x] bad?].
    • Maybe (though probably not) we’ll even get some dual political education on top issues? What are Democrat liberals saying? What are Republicans saying?
  • More support of structured data.

2. Google will aim to be the transaction hub of the web (more than just an information portal).

  • Try to find ways to pull conversions from Amazon.
  • Likely this will continue to affect the travel industry. Any industry that’s previously paid booking fees will probably be the first.
  • Push towards making Google Home transactions as simple as possible (particularly for recurring orders).
  • Then there is the goal to work up to Amazon’s Prime experience (vast inventory, two-day shipping, one-click, customer guarantees, re-ordering convenience).

3. It will double down on security.

  • Stricter prioritization of HTTPS sites.
  • More education and attempt at transparency, particularly targeted towards Washington, D.C.
  • More auditing and notifications for potentially hacked sites.
  • In-SERP visual identification for users of sites that carry higher risk or that Google believes are compromised.

4. Privacy: We’ll probably be told what Google’s doing for privacy, security and ensuring credible sources are identified. Likely this means increased notifications for identification when cookies are being stored and more updates for when apps are storing location-based information. I don’t really anticipate anything changing with actual best practices.

5. Integration of search within more hardware.

  • We already have Google Home and Android. Eventually, search will be omnipresent (following you, versus you having to transport it).
  • Image search identifying products is another step in that direction.

6. (Curveball) I’m anticipating a big purchase in the market within the next year. Current elevated pressures, higher uncertainty and a downturn will lead to more favorable prices, and Google’s reasonable growth rates allow for longer-term investments. Maybe they’ll follow Amazon and acquire something interesting in medical tech.

Where are/should companies be going?

1. Find your seamlessness. We don’t all have to be Amazon; however, you should all aim to delight users with a radical focus on creating seamlessness across every touchpoint (including website, search, voice search, apps (potentially PWAs), offline, email, display, etc.). This includes:

  • Work towards a frictionless conversion.
  • Understand new and existing customer behavior, intent and common journeys.
  • Create the infrastructure needed (e.g., Alexa skills, Google Actions, analytics implementation that allows for personalization).
  • Ensure internal site search is providing relevant results.
  • Ensure repeat customers can restock commonly purchased items simply.
  • Consider use of chatbots to lighten the load for basic, common questions and procedural tasks.
  • Make sure users are easily able to navigate to physical locations.
  • Provide users with their stage in the fulfillment funnel (think: clear, visual process forms).
  • Be transparent with users, providing access to where they are at every stage in the funnel (an online snapshot of where they are in the process can go a long way).

2. Consolidate recurring tasks and work marketing into the business-as-usual (BAU) flow.

  • Schedule a monthly meeting with marketing and development to discuss synergies and make a strategy to automate. Commit development hours to automation.
  • Review 2018 work:
    • Identify tasks that take a human less than one second to decide (the mundane work that rots a human mind), and work with development teams to see which of these issues can be automated.
    • Are there any issues from 2018 that happened often (that really shouldn’t have)? See if you can add milestones and automatic compliance checks to the current development workstream.

3. Leverage data to power better user experiences and to empower users.

  • Ensure that your company has the analytics and content management infrastructure needed to support personalization.
  • Test and develop more personalized experiences (make sure to provide search engine bots with a base-case user experience).

4. Instill credibility in every aspect of the online experience. If Google is looking for credibility and buckling down on its trust level, you need to make sure you’re showing users and search engines why you’re the best at what you do.

What should SEOs do?

Well, depending on where you are in your SEO journey, there are probably a lot of things that you could do in 2019.

Here’s a starting list (a smattering of ideas, if you will):

1. Know your brand, site, users, industry and your unique value proposition. You must understand the competitive ecosystem and why customers choose your brand.

2. Ensure technical SEO is en pointe because if search engines can’t crawl, render or index a page, it’s not going to perform well in search.

  • Ensure site = crawlable, indexable, and renderable.
  • Ensure all signals = clear, aligned.
  • Automate, where possible.
  • Detailed checklist here

3. Inventory content. See what’s working (driving KPIs, traffic, clicks or even impressions) and what’s not.

  • If it is not working – build a strategy to optimize or cut.
  • If it is working – determine why it is working (use this to build case studies). Could it be better? (e.g., is it on page one but not ranking for an available quick answer? Could it be optimized to rank for a quick answer?)
  • Research keywords, the search landscape, site performance, develop audience segments and common user journeys.
  • Look into the potential value for your site of creating non-text-based content experiences. Rand Fishkin brought up Google’s trojan horsing of the web last year and it will remain an issue. Any experience that makes it harder for Google to roll into the SERP may serve as a long-term play.

4. E-A-T audits. Essentially answer: How well do the website and content reflect your brand expertise, authoritativeness and trustworthiness? This may include things like:

  • Review about us page.
  • Ensure all content is up-to-date and factually accurate.
  • Review the site and identify opportunities to add ClaimReview Schema.org structured data (a.k.a. fact check).
  • Identify and address any common user questions.
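For the ClaimReview bullet above, a minimal JSON-LD sketch might look roughly like this (all names and URLs here are hypothetical placeholders, not prescribed values):

```json
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://www.example.com/fact-checks/widget-claim",
  "claimReviewed": "Brand X widgets last twice as long as average",
  "author": {
    "@type": "Organization",
    "name": "Example Reviews Team"
  },
  "itemReviewed": {
    "@type": "Claim",
    "appearance": {
      "@type": "CreativeWork",
      "url": "https://www.example.com/original-claim-article"
    }
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": 4,
    "bestRating": 5,
    "worstRating": 1,
    "alternateName": "Mostly true"
  }
}
```

Validating the markup in a structured data testing tool before shipping is the safe play.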

5. Where applicable (and able to be prioritized), implement fixes from the audits above.

6. Develop first-party case studies that will help build your team’s credibility within the organization and serve as navigation tools as your online strategy develops. It can be challenging when your head is buried in the work, but these resources always end up being worth having in your back pocket (or, of course, in front of executives).

Summary

  • 2019 promises to be a hairy, political hodgepodge of issues that will likely be a defining year in terms of our legislative approach to technology.
  • An understanding of the mechanics of how search engines return which results will continue to be an invaluable skill set.
  • Companies should focus on seamlessness across all channels, consolidating recurring tasks, leveraging data to build personalized experiences and instilling credibility in all aspects.
  • SEO should focus on whatever needs to be done to promote business goals, which may include: technical SEO, content, E-A-T audits, structured data, page speed optimizations and developing first-party case studies.
  • I’m excited for the messy, beautiful, exciting year that 2019 promises to be! Welcome to your twenties, Google!

Bing crawling, indexing and rendering: A step-by-step on how it works (November 6, 2018)

Crawlers are technical but understandable, thanks to Microsoft Senior Program Manager Frédéric Dubut’s presentation at SMX East.

The post Bing crawling, indexing and rendering: A step-by-step on how it works appeared first on Search Engine Land.


Let’s face it – spiders are intimidating. Yet, when you’re in SEO, understanding how spiders crawl, index and render pages is vital to ensuring a site’s ability to flourish. Last week, Frédéric Dubut, senior program manager at Microsoft, broke down each concept for us at SMX East and explained how to optimize a site for crawl efficiency.

What is a crawler?

A crawler (also called a spider or bot) fetches HTML on the Internet for indexing. To better visualize, think large stores of computers sending a program to download content.

Okay, so what?

Well, here’s the thing. Dubut stressed that building a program to visit sites and fetch information is simple; building one to be polite – not so much. A crawler can (if it visits a server too often) degrade the performance of a website (i.e., slow it down).

At the end of the day, search engines want crawlers to be “good citizens of the Internet.”

Crawl manager to the rescue!

What is a crawl manager?

Like most good supervisors, the crawl manager’s job is to listen to signals and set a budget. Its job is to estimate “how much it can crawl a site without hurting the site’s performance” (informally called the “crawl budget”). When the crawl manager senses that it’s crawling too much, it will back off; when it still hasn’t identified a point of diminishing returns, it will continue to increase the crawling.

What ‘signals’ does the crawl manager use?

The crawl manager reviews multiple levels. It uses signals (e.g., connection errors, download time, content size, status codes, etc.) to test the water and ensure that there are no anomalies. Each bottleneck layer has its own, independent crawl budget, and to be crawled, a URL must have room within the budget at every level.

Levels include:

  • Subdomain
  • Domain
  • Server
  • IP Address

What is crawl budget?

Crawl budget is how much the crawler thinks it can crawl without hurting your site’s performance. It is determined through the iterative process of evaluating the signals listed above.
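Bing hasn’t published the exact algorithm, but the iterative back-off behavior Dubut describes resembles an additive-increase/multiplicative-decrease loop. A purely illustrative sketch (all constants are invented, not Bing’s):

```python
def adjust_crawl_rate(rate, signals_healthy, max_rate=1000.0, min_rate=1.0):
    """Illustrative only: nudge the crawl rate (pages/minute) up while
    signals (download time, errors, etc.) look healthy; back off sharply
    when they don't."""
    if signals_healthy:
        return min(rate + 10.0, max_rate)  # additive increase
    return max(rate * 0.5, min_rate)       # multiplicative decrease

rate = 100.0
rate = adjust_crawl_rate(rate, signals_healthy=True)   # 110.0
rate = adjust_crawl_rate(rate, signals_healthy=False)  # 55.0
```

The real system evaluates each bottleneck level (subdomain, domain, server, IP) independently; this sketch models only a single level.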

When should you be worried about the budget?

Dubut mentioned two elements that make websites more challenging to crawl: size and optimization level (think: internal linking infrastructure, low duplicate content, strong signaling, etc.). The hardest websites to crawl are those that are large and have poor SEO, meaning the crawl budget is less than the demand (what needs to be crawled).

What can SEOs do to support the crawler?

  1. If you want to modify the time and rate of Bing’s crawler, use the Bing Webmaster Tools Crawl Control report (refer to the Configure My Site section). Author side note: Google’s documentation covers changing Googlebot’s crawl rate.
  2. Free up server resources by:
        1. Rejecting malicious actors through server-side security work.
        2. Finding ways to reduce crawl demand:
          • Remove duplicate content or leverage canonical tags
          • Consolidate redirects
          • Use an XML sitemap (include “lastmod”)
          • Remove unnecessary URL parameters
          • Remove any junk URLs or unused resources
          • Consider performance optimization for heavy, slow pages
          • If leveraging a separate mobile configuration, consider responsive web design
  3. Since each bottleneck has its own crawl budget, monitor the budget for each property, domain and IP (or IP range).
  4. During major URL shifts, allow roughly two weeks to recrawl everything (as URL shifts will temporarily double crawl demand).
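For the XML sitemap item above, a minimal sitemap with “lastmod” looks like this (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2018-10-28</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/widget-care</loc>
    <lastmod>2018-11-02</lastmod>
  </url>
</urlset>
```

An accurate lastmod helps crawlers skip unchanged pages — exactly the crawl-demand reduction described above.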

How does Bing’s crawler function (at a very abstract level)?

In the second portion of Dubut’s talk, he reiterated the importance of:

  1. Leveraging 301 redirects for permanent moves (302 redirects are only for temporary moves).
    • When a site uses a 301 redirect, the system treats it as permanent and shifts the scoring signal to the updated URL.
    • A follow-up tweet from Dubut represents Bing’s 302 process best: “We interpret 301 vs. 302 quite literally to the standard. 302 targets don’t receive signals from the source, since they are supposed to be temporary and we don’t want to dilute signal. If the crawler sees it’s the same target again and again, then it may treat it as 301 anyway.”
  2. Resolving duplicate content.
    • If both pages are the same, both will be crawled and indexed. One will be chosen.
  3. Not blocking search engines from resources needed to understand a webpage.
    • If old pages are blocked within robots.txt, the blocked URLs will remain within the index. This will dilute signaling and the potential impact of those URLs.
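As a sketch of the 301/302 distinction above, here is how the two redirects might be configured in nginx (paths and hosts are hypothetical):

```nginx
# Permanent move: scoring signals consolidate on the new URL
location = /old-page {
    return 301 https://www.example.com/new-page;
}

# Temporary move: the source URL keeps its signals
location = /summer-sale {
    return 302 https://www.example.com/current-promotion;
}
```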

Bing and JavaScript

The Bing team started working on JavaScript in 2011 with the idea that a limited portion (~5%) of the web would need to be rendered. As the web shifted towards being more JavaScript-heavy, the need to render a higher percentage of pages increased.

How does Bing handle JavaScript? What’s the rendering process?

Bing uses a crawl queue feeding a headless browser, which renders content after the initial fetch. The rendering queue is prioritized the same as anything else.

What should you do about JavaScript rendering issues?

  1. Don’t block resources necessary to understand the user experience in robots.txt.
  2. Make sure Bingbot is receiving content and is allowed to access it.
  3. If concerned about Bing’s ability to render content, use dynamic rendering to make JavaScript rendering more predictable. Side note: Confirm that a good-faith effort is made to serve the same content to users and bots.
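For the first point, a robots.txt along these lines blocks low-value URLs without cutting bots off from the CSS and JavaScript they need to render the page (all paths are hypothetical):

```text
User-agent: bingbot
# Block thin, crawl-wasting URLs...
Disallow: /internal-search/
Disallow: /*?sessionid=
# ...but keep rendering resources open
Allow: /assets/css/
Allow: /assets/js/

Sitemap: https://www.example.com/sitemap.xml
```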

What is currently in Bing’s queue?

Dubut also covered a few of Bing’s top initiatives for the next 12 months:

  1. Improving crawl efficiency.
    • Crawl efficiency is the number of useful crawls (including new pages, updated content, updated links, etc.) divided by the total number of crawls. Bing engineers’ bonuses will be tied to these numbers.
  2. A new Bing blog series on maximizing crawl efficiency.
    • Bing will be diving into how its team is improving its crawler. The blog covers how Bing improved crawl efficiency by 40 percent on the Cornell University Library site by saving resources on static, historical pages.
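Crawl efficiency, as Dubut defines it, is straightforward to compute:

```python
def crawl_efficiency(useful_crawls, total_crawls):
    """Share of fetches that found something useful: new pages,
    updated content or updated links."""
    if total_crawls == 0:
        return 0.0
    return useful_crawls / total_crawls

# e.g., 420 useful fetches out of 1,000 total
print(f"{crawl_efficiency(420, 1000):.0%}")  # prints "42%"
```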


AMP: A case for websites serving developing countries (December 6, 2017)

As the gap in connection speeds between developed and developing nations continues to widen, columnist Alexis Sanders argues that brands can use Accelerated Mobile Pages (AMP) to address this gap.

The post AMP: A case for websites serving developing countries appeared first on Search Engine Land.


Like Taylor Swift, Accelerated Mobile Pages (AMP) have a reputation. In a not-very-official Twitter poll, 53 percent claimed AMP was “breaking the web.”

The mobile ecosystem is already complex: choosing a mobile configuration, accounting for mobile-friendliness, preparing for the mobile-first index, implementing app indexation, utilizing Progressive Web Apps (PWAs) and so on. Tossing AMP into the mix, which creates an entirely duplicated experience, is not something your developers will be happy about.

And yet despite the various issues surrounding AMP, this technology has potential use cases that every international brand should pause to consider.

To start, AMP offers the potential to serve content as fast as possible. According to Google, AMP reduces the median load time of webpages to 0.7 seconds, compared with 22 seconds for non-AMP pages.

And you can also have an AMP page without a traditional HTML page. Google Webmaster Trends Analyst John Mueller has mentioned that AMP pages can be considered as a primary, canonical webpage. This has major implications for sites serving content to developing countries.

Yes, AMP is a restrictive framework that rigorously enforces its own best practices and forces one into its world of amphtml. However, within the AMP framework is a lot of freedom (and its capabilities have grown significantly over the last year). It has built-in efficiencies and smart content prioritization, and a site leveraging AMP has access to Google’s worldwide CDN: Google AMP Cache.

Source: “AMP: Above & Beyond” by Adam Greenberg

All of this is to say that if your brand serves the global market, and especially developing economies, AMP is worth the thought exercise of assessing its implications on your business and user experience.

What in the world-wide web would inspire one to consider AMP?

1. The internet is not the same worldwide

Akamai publishes an amazing quarterly report on the State of the Internet, and the numbers are startling — most of the world operates on 10 Mbps or less, with developing countries operating at less than 5 Mbps, on average.

If 10 Mbps doesn’t make your skin crawl, Facebook’s visual of 4G, 3G and 2G networks worldwide from 2016 (below) will.

Source: Facebook

The visuals show a clear picture: Developing countries don’t have the same internet and wireless network infrastructure as developed economies. This means that brands serving developing countries can’t approach them with the same formula.

2. Websites overall are getting chunkier

While all of this is happening, the average size of a website is increasing… and rapidly. According to reports by HTTParchive.org, the average total size of a webpage in 2017 is 387 percent larger than in 2010.

Despite the number of requests remaining consistent over time, the size of files continues to trend upward at an alarming rate. Creating larger sites may be okay in developed economies with strong networking infrastructures; however, users within developing economies could see a substantial lag in performance (which is especially important considering the price of mobile data).
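To make the lag concrete, here is a back-of-the-envelope calculation of raw transfer time (ignoring latency, TCP slow start and render time; the 3 MB page weight is illustrative):

```python
def transfer_seconds(page_megabytes, link_mbps):
    """Raw transfer time: megabytes converted to megabits, divided by
    link speed in megabits per second."""
    return (page_megabytes * 8) / link_mbps

PAGE_MB = 3.0  # illustrative page weight
speeds = [("25 Mbps", 25), ("10 Mbps", 10), ("5 Mbps (developing-economy average)", 5)]
for label, mbps in speeds:
    print(f"{label}: {transfer_seconds(PAGE_MB, mbps):.1f}s")
```

The same page that feels instant on a fast connection takes several times longer on a 5 Mbps link — before any rendering work even begins.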

3. Mobile is especially important for developing economies

The increase in website size and data usage comes at a time when mobile is vital within developing economies, as mobile is a lifeline connection for many countries. This assertion is reaffirmed by data from Google’s Consumer Barometer. For illustration, I’ve pulled device data to compare the US versus the developing economies of India and Kenya. The example clearly shows India and Kenya connect significantly more with mobile devices than desktop or tablet.

Source: Consumer Barometer with Google

4. Like winter, more users are coming

At the same time, the internet doesn’t show any signs of slowing down, especially not in developing countries. A recent eMarketer study on Internet Users Worldwide (August 2017) shows a high level of growth in developing countries, such as India, at 15.2 percent. Even the US saw a +2.2 percent bump in user growth!

User penetration as a percent of a country’s total population shows there is still room for growth as well — especially in developing countries.

5. The divide in speed is growing

In the chart below, I chose nine developing countries (per the United Nations’ World Economic Situation and Prospects report) to compare with the United States’ internet speed (which ranked 10th worldwide in the last report). Despite the overarching trend of growth, there is a clear divide emerging in late 2012 — and it appears to be growing.


Why is this significant? As internet connection speeds increase, so do page sizes. But as page sizes increase to match the fast speeds expected in developed nations, it means that users in developing nations are having a worse and worse experience with these websites.

So, what should one do about it?

The data above paint a picture: Internet penetration worldwide continues to grow rapidly, especially in developing nations, where mobile devices are the primary way to access the internet. At the same time, webpages are getting larger and larger — potentially leading to a poor user experience for internet users in developing nations, where average connection speeds have fallen far behind those in the US and other developed nations.

How can we address this reality to serve the needs of users in developing economies?

Test your mobile experience.

AMP isn’t necessary if your site leverages mobile web optimization techniques, runs lean and is the picture of efficiency; however, this is challenging (especially given today’s web obesity crisis). Luckily, many tools offer free speed analyses for webpages.

Develop empathy through experience.

Allow yourself to step into your customers’ shoes and experience your site. As former CEO of Moz, Rand Fishkin, once aptly stated, “Customer empathy > pretty much everything else.”

Regular empathy is hard. Empathy for people you don’t know is nearly impossible. If we don’t see the problem, feel it and internalize the challenge, we can’t hope to alleviate it.

Facebook introduced 2G Tuesdays, where employees logging into the company’s app on Tuesday mornings are offered the option to switch to a simulated 2G connection for an hour, to build empathy for users in the developing world. If you’re looking to try something similar, any Chrome/Canary user can simulate connection conditions through the Network panel in Chrome Developer Tools.

Consider if AMP is right for your site.*

You should entertain the thought of leveraging AMP as a primary experience if your brand meets the following criteria:

  • Your site struggles with page-speed issues.
  • You’re doing business in a developing economy.
  • You’re doing business with a country with network infrastructure issues.
  • The countries you target leverage browsers and search engines that support AMP.
  • Serving your content to users as efficiently as possible is important to your brand, service and mission.

*Note: AMP’s architecture can also be used to improve your current site and inform your page speed optimization strategy, including:

  • Paying attention to and limiting heavy third-party JavaScript, complex CSS, and non-system fonts (where impactful to web performance, and not interfering with the UX).
  • Making scripts asynchronous (where possible).
  • For HTTP/1.1, limiting requests to prevent round-trip loss via pruning or inlining (this does not apply to HTTP/2, due to multiplexing).
  • Leveraging resource hints (a.k.a. the Pre-* Party), where applicable.
  • Optimizing images (including using the optimal format, appropriate compression, serving images as close to their display size as possible, the srcset attribute, and lazy loading where necessary).
  • Using caching mechanisms appropriately.
  • Leveraging a CDN.
  • Paying attention to and actively evaluating the page’s critical rendering path.
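For the resource-hints bullet above, the common “pre-*” hints look like this in the document head (hosts and paths are placeholders):

```html
<!-- Resolve DNS / open connections early for third-party hosts -->
<link rel="dns-prefetch" href="//cdn.example.com">
<link rel="preconnect" href="https://fonts.example.com" crossorigin>

<!-- Fetch a critical resource for this page at high priority -->
<link rel="preload" href="/assets/css/critical.css" as="style">

<!-- Hint a likely next navigation at low priority -->
<link rel="prefetch" href="/next-article.html">
```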

Educate your team about AMP, and develop a strategy that works for your brand.

AMP has a plethora of great resources on the main AMP Project site and AMP by Example.

If you decide to go with AMP as a primary experience in certain countries, don’t forget to leverage the appropriate canonical/amphtml and hreflang tags. And make sure to validate your code!
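The tag pairing mentioned above looks like this (all URLs are hypothetical):

```html
<!-- On the regular page: point to its AMP version -->
<link rel="amphtml" href="https://www.example.com/amp/article">

<!-- On the AMP page: point to the canonical (a standalone AMP page points to itself) -->
<link rel="canonical" href="https://www.example.com/article">

<!-- hreflang annotations for language/country targeting -->
<link rel="alternate" hreflang="en-in" href="https://www.example.com/in/article">
<link rel="alternate" hreflang="en-ke" href="https://www.example.com/ke/article">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/article">
```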
