Barry Schwartz – Search Engine Land: News On Search Engines, Search Engine Optimization (SEO) & Search Engine Marketing (SEM)

Video: Purna Virji of Microsoft on inclusion and accessibility in search (July 13, 2020)
Purna also won the US Search Personality Award in 2019 and is just a really good person.

In the next interview I conducted at SMX West in early 2020, I sat down with Purna Virji, the Senior Manager of Global Engagement at Microsoft Advertising. Purna is one of the most loved and respected individuals in our industry, and that is not just my opinion: she won the US Search Personality Award in 2019. She has a rich history in the SEM industry, and speaking with her for several minutes was a lot of fun.

The first topic we discussed was inclusion and accessibility in search marketing. Purna explained how thinking about inclusion and accessibility, and making corresponding changes to your search campaigns and websites, can lead to new revenue and customer acquisition opportunities. It is also simply the right thing to do.

The next topic was PPC automation and how a metric called return on ad spend (ROAS) can help you with that automation. Purna also shared some PPC tips, such as using ad customizers.

Purna Virji can be followed on Twitter @purnavirji.

Here is the video:

I started this vlog series recently, and if you want to sign up to be interviewed, you can fill out this form on Search Engine Roundtable. You can also subscribe to my YouTube channel by clicking here.

Google Images adds more facts about images with the Knowledge Graph (July 8, 2020)
If you notice more traffic to your site from Google Image Search, this may be why.

Google announced it has launched a new feature within Google Image search on mobile to show “quick facts about what you see on Google Images.” Google has added knowledge panel details under the image preview window that you can open and expand to show more information about the image.

Knowledge panel expanders. Google will show drop-down menus under the image preview that reveal more details about the image. Google said, “when you search for an image on mobile in the U.S., you might see information from the Knowledge Graph related to the result. That information would include people, places or things related to the image from the Knowledge Graph’s database of billions of facts, helping you explore the topic more.”

What it looks like. Here is a GIF of it in action from Google:

Testing. Google has been testing this for the past few weeks and has now officially launched it.

More details. Google said “to generate these links to relevant Knowledge Graph entities, we take what we understand about the image through deep learning, which evaluates an image’s visual and text signals, and combine it with Google’s understanding of the text on the image’s web page. This information helps us determine the most likely people, places or things relevant to a specific image. We match this with existing topics in the Knowledge Graph, and then surface them in Google Images when we’re confident we’ve found a match.”

Why we care. Google may end up showing content from Wikipedia or other sources on the internet. If you see more traffic from Google Image search, it may be because Google is showing your content in these knowledge panels and searchers are clicking through to your website.

Optimizing for them works the same as optimizing for any knowledge panel. That is for another article.

Google Rich Results Test tool now out of beta (July 7, 2020)
Google is deprecating the old Structured Data Testing Tool.

Google announced that it has removed the beta label from the Rich Results Test tool. The tool now “fully supports all Google Search rich result features.”

Deprecating the Structured Data Testing Tool. With that, Google said it will begin to deprecate the Structured Data Testing Tool. The old tool “will still be available for the time being,” but Google does plan to retire it at some point in the future, so it “strongly recommends” that you use the Rich Results Test to test and validate your structured data.
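
To see what that validation looks like in practice, here is a minimal, hypothetical JSON-LD snippet you could paste into the Rich Results Test; the headline, date and author values are made up for illustration:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example article headline",
    "datePublished": "2020-07-07",
    "author": {
      "@type": "Person",
      "name": "Jane Example"
    }
  }
  </script>

If the markup is valid, the tool reports which rich result features the page is eligible for; if not, it flags the offending fields.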

Some history. The Rich Results Test launched in December 2017 as an upgrade to the Structured Data Testing Tool, which launched in 2015. The Structured Data Testing Tool is still available over here.

Rich Results Test. Google said these are some of the reasons to use the Rich Results Test over the Structured Data Testing Tool:

  • It shows which Search feature enhancements are valid for the markup you are providing.
  • It handles dynamically loaded structured data markup more effectively.
  • It renders both mobile and desktop versions of a result.
  • It is fully aligned with Search Console reports.

You can learn more about this over here and the help document is over here.

Why we care. If you are using the old Structured Data Testing Tool, you will need to prepare for that tool to go away. Get accustomed to the Rich Results Test tool instead.

Video: Christi Olson of Microsoft on audience targeting, syncing with Google and Promote IQ (July 6, 2020)
Christi Olson is the head of Evangelism for Search and Advertising at Microsoft.

I had the privilege of sitting down for a few minutes at SMX West earlier this year with Christi Olson, the head of Evangelism for Search and Advertising at Microsoft. She has been with Microsoft on and off for a long time, since before it was called Bing — seeing the launch of Live Search, adCenter, and the transitions to Bing and Microsoft Advertising.

We talked about audience targeting, a topic she is passionate about. Audience targeting can help you segment your ads to reach the right people at the right time. Did you know that you can sync and import your Google Ads campaign data into Microsoft Advertising? You can also sync up between Google Search Console and Bing Webmaster Tools. PromoteIQ, which Microsoft acquired in 2019, helps e-commerce retailers compete against the bigger online retailers with their digital commerce strategy.

Christi used to travel a lot before the pandemic — in fact, she typically spent a quarter of the year on the road. And she is a wife and mother. You can follow Christi at @ChristiJOlson on Twitter.

Here is the video:

I started this vlog series recently, and if you want to sign up to be interviewed, you can fill out this form on Search Engine Roundtable. You can also subscribe to my YouTube channel by clicking here.

Regular expression filter support coming to Google Search Console performance reports (July 6, 2020)
This new support is mentioned in the Google help documents but does not seem to be live yet.

Google updated the help document for the performance report within Google Search Console to say you can use regular expressions to filter the report results.

Google wrote, “If you choose the Custom (regex) filter, you can filter by a regular expression (a wildcard match) for the selected item. You can use regular expression, or regex, filters for page URLs and user queries. The RE2 syntax is used.”

Regular expressions. Regex is often used by developers, and RE2 is the same regex syntax Google Analytics users are accustomed to using for querying data. Regex can be extremely powerful and fast for filtering data, and even for replacing data, but it can also be tricky to get right. A regular expression is a sequence of characters that defines a search pattern. Usually, such patterns are used by string-searching algorithms for “find” or “find and replace” operations on strings, or for input validation.

More on what you can do. Google listed these bullets in the help document:

  • The default matching is “partial match,” which means that your regular expression can match anywhere in the target string unless you use ^ or $ to require matching from the start or end of the string, respectively.
  • Default matching is case-sensitive. You can specify “(?i)” at the beginning of your regular expression string for case-insensitive matches. Example: (?i)https
  • Invalid regular expression syntax will return no matches.
  • Regular expression matching is tricky; try out your expression on a live testing tool, or read the full RE2 syntax guide. (A short sketch of these rules in action follows below.)
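
To make those rules concrete, here is a small Python sketch. Python’s re module is close enough to RE2 for these basics, and the URLs are made up for illustration:

  import re

  urls = [
      "https://example.com/blog/seo-tips",
      "http://example.com/blog/ppc-tips",
      "https://example.com/about",
  ]

  # Partial match: "blog" can match anywhere in the string.
  print([u for u in urls if re.search(r"blog", u)])
  # -> both /blog/ URLs

  # Anchored match: ^https requires the string to start with "https".
  print([u for u in urls if re.search(r"^https", u)])
  # -> only the two https:// URLs

  # Case-insensitive match: (?i) at the start of the pattern.
  print(bool(re.search(r"(?i)BLOG", urls[0])))
  # -> True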

Examples. Google also listed the more common ways to use this in the help document.

Coming soon. It is not clear when this will go live, but I suspect that when you try to filter by page, query or other performance filters, you will at some point see the option to use regex. At the time we published this, Google had removed all mentions of regex from the help document.

Why we care. This will give SEOs and developers an additional way to pull more data, the way they want to see it, out of Search Console. You need to be careful when using regex; if done wrong, the outcome will often be wrong as well. At least in this case, regex is only being used as a “find” command and not a “find and replace” command, so you cannot do damage in the sense of removing your site from Google search. But you can do damage in the sense that you can misinterpret your data if you use it the wrong way.

Google updated guidelines to say spam reports are not for manual actions (July 3, 2020)
Google only uses spam reports to improve its spam prevention algorithms.

Gary Illyes of Google announced on the Google webmaster blog this morning that Google wanted to clarify that spam reports are only used to improve Google’s spam detection algorithms. Google removed any mention of manual actions from the Google webmaster guidelines and says these spam reports will not lead to a Google employee reviewing a site and manually penalizing it.

Automated spam prevention. The difference is that when you submit a spam report, it will only be used by Google to figure out how to improve its search algorithms. You will not see a site you reported drop in the Google search results directly because of a spam report. Google may use that information to update its algorithms, but it will not manually penalize any specific site.

Manual actions. Manual actions are penalties Google employees can assign to individual sites or pages when the site violates Google’s webmaster guidelines. When you receive a manual action, that action will be shown within Google Search Console. Again, Google spam reports will not lead to a manual action.

Before. This is what the guidelines said prior to this update:

“If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google’s search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.”

After. This is what the guidelines say now, after this update:

“If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, and will use the report for further improving our spam detection systems.”

More. Gary Illyes from Google wrote “spam reports play a significant role: they help us understand where our automated spam detection systems may be missing coverage. Most of the time, it’s much more impactful for us to fix an underlying issue with our automated detection systems than it is to take manual action on a single URL or site.”

Why we care. This is Google being clear that when you submit spam reports, you should not expect immediate action or a manual action to be associated with the site you are reporting. It will take time for Google to improve its algorithms and for those algorithms to show an impact in the Google search results.

WSJ: GoogleBot can add products to shopping carts (July 1, 2020)
Make sure to double-check those abandoned cart metrics; they may be inflated.

A Google crawler has been adding products to e-commerce sites’ shopping carts, the Wall Street Journal reported Wednesday. Sellers have been complaining about a serial cart abandoner named “John Smith.” It turns out John is a Google bot. A Google spokesperson told the Wall Street Journal that it built systems to ensure the pricing seen on product pages is reflected when a user adds a product to the cart.

GoogleBot shopping. Google told Search Engine Land in a statement, “We use automated systems to ensure consumers are getting accurate pricing information from our merchants.”

Sellers that upload their product feeds to Google Merchant Center may not realize it, but when they agree to the Terms of Service, they agree to having Google’s bots crawl their sites for price verification. The bot is designed to ensure the price in the feed matches the price on the product page and the price when the product is added to the cart.

The automated system will disapprove items that don’t pass pricing verifications.
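
As an aside on price hygiene, schema.org Product markup is a common way to keep the on-page price machine-readable. The article does not say Google’s cart bot reads structured data, so treat this as a general sketch with made-up values:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
      "@type": "Offer",
      "price": "19.99",
      "priceCurrency": "USD",
      "availability": "https://schema.org/InStock"
    }
  }
  </script>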

Abandoned carts. Google is aware that this may cause issues for merchants and owners of e-commerce sites. Google told the WSJ, “This sometimes leads to merchants seeing abandoned carts as a result of our system testing the price displayed matches the price at checkout.” That data can skew e-commerce site owners’ abandoned cart metrics, making them look artificially high.

Can you block it? Not if you want to participate in Google Shopping or show your products in Surfaces across Google. Google’s own terms of service for Google Merchant Center allow Google to crawl your site. It is possible, though, to control the rate of crawling via the site’s robots.txt file.
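
For illustration, a Crawl-delay line in robots.txt is the conventional way to ask crawlers to slow down. Note this is a sketch, not a guarantee: support varies by crawler, and Googlebot has historically ignored Crawl-delay in favor of the crawl-rate setting in Search Console:

  # Ask crawlers to wait 10 seconds between requests.
  # Support varies: Bingbot honors Crawl-delay, while Googlebot
  # has historically ignored it (its crawl rate is adjusted in
  # Search Console instead).
  User-agent: *
  Crawl-delay: 10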

Why we care. Google said it is looking to clarify how these automated systems work with merchant websites in the future to avoid confusion. E-commerce site owners: if you’ve noticed funky abandoned cart metrics, it may be a bot.

Bing’s search ranking factors: relevance, quality & credibility, user engagement, freshness, location and page load time (June 30, 2020)
Want to know how Bing ranks web pages? Here is how Bing describes it within its own guidelines.

Bing’s newly updated Webmaster Guidelines document how the search engine generally decides to rank web pages in its search results. Bing breaks down how it ranks web pages based on relevance, quality & credibility, user engagement, freshness, location and page load time.

The guidelines explain that the search results are algorithmic and not compiled by hand. “Bing search results are generated by using an algorithm to match the search query a user enters into the search engine with content in our index,” Bing wrote. Bing is also continually improving those algorithms; it wrote that it “designs – and continually improves – its algorithms to provide the most comprehensive, relevant and useful collection of search results available.”

Caveat on these ranking factors. Before Bing lists out its ranking factors, it explains that ranking is complex and that it uses many criteria to deliver search results. Bing wrote, “please note that Bing’s complex ranking systems use many criteria to deliver search results, and the relative importance of each of the parameters described below may vary from search to search and may evolve over time.” Bing did, however, say that the ranking factors below “are listed in general order of importance.”

Relevance. Bing wrote “relevance refers to how closely the content on the landing page matches the intent behind the search query. This includes matching terms directly on the page as well as terms used in links referring to the page. Bing also considers semantic equivalents, including synonyms or abbreviations, which may not be exact matches of the query terms but are understood to have the same meaning.”

This paragraph does not reveal too much, but it is good for Bing to state it, for obvious reasons.

Quality & credibility. Bing says in this section that it can use the author’s credibility or a site’s reputation. Bing specifically says that determining “the quality and credibility of a website includes an evaluation of the page itself,” including “such factors as the author’s or site’s reputation.” The example given: “an article with citations and references to data sources is considered higher quality than one that does not cite data sources.” This can also work in the opposite direction: “Bing may demote content that includes name-calling, offensive statements, or uses derogatory language to make a point,” and Bing also weighs the completeness of the content and the transparency of authorship.

Here Bing is saying that it can demote sites that engage in name-calling, publish offensive statements or use derogatory language. It also looks at whether the content is complete and the authorship is transparent.

User engagement. For years, Google has said it does not look at user engagement factors, such as click-throughs and time spent on site, for ranking. Now Bing, Google’s competitor, says it does. Bing wrote, “Bing also considers how users interact with search results.”

How does Bing do this? Bing says, “to determine user engagement, Bing asks questions like: Did users click through to search results for a given query, and if so, which results? Did users spend time on these search results they clicked through to or did they quickly return to Bing? Did the user adjust or reformulate their query?” In fact, Bing Webmaster Tools provides these analytics and insights into how users interact with your webpages, and Bing can use those insights for ranking purposes.

Freshness. Bing says it “prefers” content that is fresh and has up-to-date information, although it depends on the content and the category. Bing wrote, “Generally Bing prefers content that is more “fresh” – meaning that the page consistently provides up-to-date information. In many cases, content produced today will still be relevant years from now. In some cases, however, content produced today will go out of date quickly.”

Location. A searcher’s location can influence what content is ranked. Bing wrote that when “ranking results Bing considers where the user is located (country and city), where the page is hosted, the language of the document, or the location of other visitors to the page.”

Page load time. Finally, Bing also said that “slow page load times can lead a visitor to leave your website, potentially before the content has even loaded, to seek information elsewhere.” Because of this, “Bing may view this as a poor user experience and an unsatisfactory search result.” Bing, like Google, prefers faster page loads, but Bing added that “webmasters should balance absolute page load speed with a positive, useful user experience.”

Why we care. It goes without saying that every SEO and marketer cares about how to improve their site’s rankings in Bing and Google. The details above spell out how Bing ranks web pages.

Bing updates its Webmaster Guidelines (June 30, 2020)
These new Bing guidelines discuss the fundamental principles behind how Bing crawls, indexes and ranks content - make sure to review them.

Bing has vastly updated its Webmaster Guidelines. The updated guidelines are broken down into multiple sections, including:

  • How Bing finds and indexes your site
  • Help Bing understand your pages
  • How Bing ranks your content
  • Abuse and examples of things to avoid

Previous guidelines. It has been a while since Bing updated its guidelines; Bing first published its webmaster guidelines in 2012. We have those guidelines archived in this screen capture over here.

Why update it. Bing updated the Bing Webmaster Guidelines to reflect the various updates it has made to search over the years. This includes changes to how Bing crawls, indexes and ranks web pages, in addition to how Bing handles search spam.

Updated information around the URL submission API, support of rel=”sponsored” and rel=”ugc”, how Bing indexes JavaScript, the evergreen BingBot, and much more is all discussed in this document.
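
As one concrete example of those features, here is a hedged Python sketch of the URL Submission API as commonly documented around this time; the endpoint and payload shape reflect my understanding of the API, and the key and URLs are placeholders, so check Bing’s documentation before relying on this:

  import requests

  # Placeholders: substitute your own API key and site URLs.
  API_KEY = "YOUR_BING_WEBMASTER_API_KEY"
  ENDPOINT = (
      "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl"
      f"?apikey={API_KEY}"
  )

  payload = {
      "siteUrl": "https://www.example.com",
      "url": "https://www.example.com/new-page",
  }

  response = requests.post(ENDPOINT, json=payload, timeout=10)
  response.raise_for_status()  # non-2xx responses raise an HTTPError
  print(response.status_code)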

“It was time to modernize and refresh the global Bing Webmaster Guidelines, providing insights on how Bing discovers, crawls, indexes and ranks content,” said Fabrice Canel, Principal PM for Bing Webmaster Tools. “When Fabrice and I speak at industry events, we receive a lot of in-depth questions around the specifics of Bing’s Webmaster Guidelines and how elements from discovery to ranking have changed since we’ve refreshed Bing’s Webmaster Tools. We decided to refresh the entire guidelines to make them easier to understand while including the most recent updates on crawling, indexing, ranking and quality,” said Christi Olson, Head of Evangelism for Search at Microsoft.

Why we care. Often, the search engine guidelines act as the fundamental principles behind how the search engine crawls, indexes and ranks content. You want to make sure to read those guidelines to understand how the search engine works, and to ensure you do not take steps that lead your site to be penalized by the search engine.

You can check out the new Bing Webmaster Guidelines over here.

Bing supports rel=sponsored & rel=ugc (June 30, 2020)
You can use the rel="nofollow", rel="sponsored" or rel="ugc" attributes to "prevent the links from being followed by a crawler and from potentially impacting search rankings."

A new tidbit found in the updated Bing Webmaster Guidelines is that Bing now supports rel=”sponsored” and rel=”ugc” attributes on your links. This is in addition to Bing supporting the rel=”nofollow” link attribute, which we know it has supported since its introduction over ten years ago.

New link attributes. These two new link attributes were introduced by Google last September. Here is how Bing says they should be handled; Bing wrote in the guidelines: “make a reasonable effort to ensure that any paid or advertisement links on your site use rel=”nofollow” or rel=”sponsored” or rel=”ugc” attribute to prevent the links from being followed by a crawler and from potentially impacting search rankings.”

rel="sponsored": The new sponsored attribute can be used to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel="ugc": The ugc attribute value is recommended for links within user generated content, such as comments and forum posts.

A Bing spokesperson confirmed with Search Engine Land that these link attributes are treated as hints, not necessarily directives. So technically, Bing can decide to still follow these links if it deems that necessary. The rel=sponsored and rel=ugc attributes are being monitored by Bing and are “not as strong of a signal for Bing,” we were told. This may change based on adoption rates over time.

Why we care. This means that if you decide to add these attributes to your site, Bing will also take them into account. Bing’s guidelines say you can use all three attributes (rel=”nofollow”, rel=”sponsored” or rel=”ugc”) to “prevent the links from being followed by a crawler and from potentially impacting search rankings.”
