Jayson DeMers – Search Engine Land

Has AI changed the SEO industry for better or worse? (Nov. 8, 2017)

Columnist Jayson DeMers explores the impact of Google’s shift toward machine learning and discusses what the future will look like for search professionals.

With Google turning to artificial intelligence to power its flagship search engine business, has the SEO industry been left in the dust? The old ways of testing and measuring are becoming antiquated, and industry insiders are scrambling to understand something new — something which is more advanced than their backgrounds typically permit.

The fact is, even Google engineers are having a hard time explaining how Google works anymore. With this in mind, is artificial intelligence changing the SEO industry for better or worse? And has Google’s once-understood algorithm become a “runaway algorithm?”

Who was in the driver’s seat?

The old days of Google were much simpler times. Artificial intelligence may have existed back then, but it was used for very narrow issues, like spam filters on Gmail. Google engineers spent most of their time writing preprogrammed “rules” that worked to continuously close the loopholes in their search engine — loopholes that let brands, with the help of SEO professionals, take advantage of a static set of rules that could be identified and then exploited.

However, this struck at the heart of Google’s primary business model: the pay-per-click (PPC) ad business. The easier it was to rank “organically” (in Google’s natural, unpaid rankings), the fewer paid ads were sold. These two distinctly different parts of their search engine have been, and will always be, at odds with one another.

If you doubt that Google sees its primary business as selling ads on its search engine, you haven’t been watching Google over the past two decades. In fact, almost 20 years after it started, Google’s primary business was still PPC. In 2016, PPC revenues still represented 89 percent of its total revenues.

At first glance, it would stand to reason that Google should do everything it can to make its search results both user-friendly and maintainable. I want to focus on this last part — having a code base that is documented well enough (at least internally within Google) that it can be explained to the public, serving as a textbook for how websites should be structured and how professionals should interact with its search engine.

Going up the hill

Throughout the better part of Google’s history, the company has made efforts to ensure that brands and webmasters understood what was expected of them. In fact, they even had a liaison to the search engine optimization (SEO) world, and his name was Matt Cutts, the head of Google’s Webspam Team.

Cutts would go around the SEO conference circuit and often be the keynote or featured session speaker. Any time Google was changing its algorithms or pushing a new update to its search engine, Cutts would be there to explain what that meant for webmasters.

It was quite the spectacle. In one room, you typically had hundreds of SEOs who were attacking every loophole they could find, every slim advantage they could get their hands on. In the very same room, you had Cutts explaining why those techniques were not going to work in the future and what Google actually recommended.

As time went on and loopholes were closed, Cutts became one of the only sources of hope for SEOs. Google was becoming more sophisticated than ever, and with very few loopholes left to exploit, Cutts’s speaking engagements became crucial for SEOs to review and dissect.

The ‘uh-oh’ moment

And then, the faucet of information slowed to a trickle. Cutts’ speaking engagements became rarer, and his guidelines became more generic. Finally, in 2014, Cutts took a leave from Google. This was a shock to insiders who had built an entire revenue model off of selling access to this information.

Then, the worst news for SEOs: He was being replaced by an unnamed Googler. Why unnamed? Because the role of spokesperson was being phased out. No longer would Google be explaining what brands should be doing with each new update of its search engine.

The more convoluted its search engine algorithms were, the more PPC ads Google sold. As a result of this shift, Google capitalized immensely on PPC ad revenue. It even created “Learn with Google,” a gleaming classroom where SEO conference attendees could learn how to maximize PPC spend.

An article by Search Engine Land columnist Kristine Schachinger about the lack of information on a major algorithmic update, and Google’s flippant response by interim spokesman Gary Illyes, captured the SEO industry’s frustration in a nutshell. What was going on?

Removing the brakes — the switch to an AI-powered search engine

At the same time, Google was experimenting with new machine learning techniques to automate much of the updating process to its search engine. Google’s methodology has always been to automate as much of its technology as it could, and its core search engine was no different.

The pace of Google’s search engine switch to artificial intelligence caught many off-guard. This wasn’t like the 15 years of manual algorithm updates to its index. This felt like a tornado had swept in — and within a few years, it changed the landscape of SEO forever.

The rules were no longer in some blog or speech by Matt Cutts. Here stood a breathtaking question: Were the rules even written down at Google anymore?

Much of the search engine’s algorithms and their weightings were now controlled by a continuously updating machine-learning system that changed its weightings from one keyword to the next. Marcus Tober, CTO of Searchmetrics, said that “it’s very likely that even Google engineers don’t know the exact composition of their highly complex algorithm.”

The runaway algorithm

Remember Google’s primary revenue stream? PPC represents almost 90 percent of its business. Once you know that, the rest of the story makes sense.

Did Google know beforehand that the switch to an AI-powered search engine would lead to a system that couldn’t be directly explained? Was it a coincidence that Cutts left the spotlight in 2014, and that the position never really came back? Was it that Google didn’t want to explain things to brands anymore, or that they couldn’t?

By 2017, Google CEO Sundar Pichai began to comment publicly on Google’s foray into artificial intelligence. Bob Griffin, CEO of Ayasdi, wrote recently that Pichai made it clear that there should be no abdication of responsibility associated with intelligent technologies. In other words, there should be no excuse like “The machine did x.”

Griffin put it clearly:

Understanding what the machine is doing is paramount. Transparency is knowing what algorithm was used, which parameters were used in the algorithm and, even, why. Justification is an understanding of what it did, and why in a way that you can explain to a reporter, shareholder, congressional committee or regulator. The difference is material and goes beyond some vague promise of explainable AI.

But Google’s own search engineers were seemingly unable to explain how their own search engine worked anymore. This disconnect had grown so pronounced that in late 2017, Google hired longtime SEO journalist Danny Sullivan in an attempt to reestablish its image of transparency.

But why such a move away from transparency in the first place? Could it be that the move to artificial intelligence, something that went way over the heads of even the most experienced digital marketing executives, was the perfect cover? Was Google simply throwing its proverbial hands up in the air and saying, “It’s just too hard to explain”? Or was Google just caught up in the transition to AI, trying to find a way to explain things like Matt Cutts used to do?

Regardless of Sullivan’s hire, the true revenue drivers meant that this wasn’t a top priority. Google had solved some of the most challenging technical problems in history, and they could easily have attempted to define these new technical challenges for brands, but it simply wasn’t their focus.

And, not surprisingly, after a few years of silence, most of the old guard of SEO had accepted that the faucet of truly transparent communication from Google had been shut off, never to be reopened.

Everyone is an artificial intelligence expert

Most SEO experts’ backgrounds do not lend themselves very well to understanding this new type of Google search. Why? Most SEO professionals and digital marketing consultants have a marketing background, not a technical background.

When asked “How is AI changing Google?” most industry thought leaders have given generic answers: AI really hasn’t changed much, and effective SEO still requires the same strategies you’ve always pursued. In some cases, the responses simply had nothing to do with AI in the first place.

Many SEO professionals, who know absolutely nothing about how AI works, have been quick to deflect any questions about it. And since very few in the industry had an AI background, the term “artificial intelligence” became almost something entirely different — just another marketing slogan, rather than an actual technology. And so some SEO and digital marketing companies even began positioning themselves as the new “artificial intelligence” solution.

The runaway truck ramp?

As with all industries, whenever there’s a huge shift in technology, there tends to be a changing of the guard. A number of highly trained engineers are beginning to make the SEO industry their home, and these more technologically savvy folks are starting to speak out.

And, for every false claim of AI, there are new AI technologies that are starting to become mainstream. And these are not your typical SEO tools and rank trackers.

Competitive industries are now investing heavily in things like genetic algorithms, particle swarm optimization and new approaches that enable advanced SEO teams to model exactly what Google’s RankBrain is attempting to do in each search engine environment.

At the forefront of these technologies is industry veteran and Carnegie Mellon alumnus Scott Stouffer, founder and CTO of MarketBrew.com, who chose to create and patent a statistical search engine modeling tool, based on AI technologies, rather than pursuing a position at Google.

Now, 11 years into building his company, Stouffer has said:

There are a number of reasons why search engine modeling technology, after all these years, is just now becoming so sought-after. For one, Google is now constantly changing its algorithms, from one search query to the next. It doesn’t take a rocket scientist to know that this doesn’t bode well for SEO tools that run off of a static set of pre-programmed rules.

On the flipside, these new search engine models can actually be used to identify what the changes are statistically, to learn the behavior and characteristics of each search engine environment. The models can then be used to review why your rankings shifted: was it on-page, off-page, or a mixture of both? Make an optimization on your site, and rerun the model. You can instantly see if that change will statistically be a positive or negative move.

I asked Stouffer to give me a concrete example. Let’s say you see a major shift in rankings for a particular search result. These search engine modeling tools start with what Stouffer calls a “standard model.” (Think of this as a generic search engine that has been regression-tested to be a “best fit,” with adjustable weightings for each algorithmic family.) This standard model is then run through a process called particle swarm optimization, which locates a stable mixture of algorithmic weightings that will produce search results similar to the real thing.

Here’s the key: If you do this before and after each algorithmic shift, you can measure the difference between the two fitted models. Stouffer says the SEO teams that invest in Market Brew technology do this to determine what Google has done with its algorithm: For instance, did it put more emphasis on title tags, backlinks, structured data and so on?
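To make that idea concrete, here is a minimal Python sketch of the kind of weight-fitting Stouffer is describing: a tiny particle swarm searches for factor weightings that reproduce an observed ranking. This is not Market Brew’s actual implementation; the factor names, page scores, observed ordering and swarm parameters are all hypothetical placeholders, and a real model would score hundreds of signals per page.

```python
import random

# Hypothetical per-page scores for a few algorithmic "families" (0-1 scale).
# In a real model these would come from crawling and scoring each result.
PAGES = {
    "page_a": {"title_relevance": 0.9, "backlinks": 0.4, "structured_data": 0.7},
    "page_b": {"title_relevance": 0.6, "backlinks": 0.9, "structured_data": 0.2},
    "page_c": {"title_relevance": 0.3, "backlinks": 0.7, "structured_data": 0.9},
}
FACTORS = ["title_relevance", "backlinks", "structured_data"]

# The ordering actually observed in the live search result (best first).
OBSERVED = ["page_b", "page_a", "page_c"]

def rank_with(weights):
    """Rank pages by a weighted sum of their factor scores."""
    def score(page):
        return sum(w * PAGES[page][f] for w, f in zip(weights, FACTORS))
    return sorted(PAGES, key=score, reverse=True)

def loss(weights):
    """Count how many positions disagree with the observed ranking."""
    return sum(1 for a, b in zip(rank_with(weights), OBSERVED) if a != b)

def particle_swarm(n_particles=20, iters=100):
    """Tiny PSO: each particle is a candidate weight vector in [0, 1]^n."""
    dim = len(FACTORS)
    pos = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # each particle's best position so far
    gbest = min(pos, key=loss)[:]             # global best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
            if loss(pos[i]) < loss(gbest):
                gbest = pos[i][:]
    return gbest

if __name__ == "__main__":
    fitted = particle_swarm()
    print("Fitted weights:", {f: round(w, 2) for f, w in zip(FACTORS, fitted)})
    print("Model ranking :", rank_with(fitted))
    print("Observed      :", OBSERVED)
```

Run a fit like this against a results page before and after a shake-up, and the difference between the two fitted weight vectors is the kind of before-and-after comparison Stouffer describes.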

Suffice it to say, there are some really smart people in this industry who are quickly steering the runaway algorithm back onto the road.

Chris Dreyer of Rankings.io put it best:

I envision SEO becoming far more technical than it is today. If you think about it, in the beginning, it was super easy to rank well in search. The tactics were extremely straightforward (i.e., keywords in a meta tag, any link placed anywhere from any other website helped, etc.). Fast forward just a decade and SEO has already become much more advanced because search algorithms have become more advanced. As search engines move closer to the realistic human analysis of websites (and beyond), SEOs will have to adapt. We will have to understand how AI works in order to optimize sites to rank well.

As far as Google goes, the hiring of Sullivan should be a very interesting twist to follow. Will Google try to reconcile the highly technical nature of its new AI-based search engine, or will it be more of the same: generic information intended to keep these new technologists at bay and Google’s top revenue source safe?

Can these new search engine modeling technologies usher in a new understanding of Google? Will the old guard of SEO embrace these new technologies, or is there a seismic shift underway, led by engineers and data scientists, not marketers?

The next decade will certainly be an interesting one for SEO.

Can we machine-learn Google’s machine-learning algorithm? (Feb. 16, 2017)

As Google becomes increasingly sophisticated in its methods for scoring and ranking web pages, it’s more difficult for marketers to keep up with SEO best practices. Columnist Jayson DeMers explores what can be done to keep up in a world where machine learning rules the day.

Google’s rollout of artificial intelligence has many in the search engine optimization (SEO) industry dumbfounded. Optimization tactics that have worked for years are quickly becoming obsolete or changing.

Why is that? And is it possible to find a predictable optimization equation like in the old days? Here’s the inside scoop.

The old days of Google

Google’s pre-machine-learning search engine operated monolithically. That is to say, when changes came, they came wholesale. Large and abrupt movements, sometimes tectonic, were commonplace in the past.

What applied to one industry/search engine result applied to all results. This was not to say that every web page was affected by every algorithmic change. Each algorithm affected a specific type of web page. Moz’s algorithm change history page details the long history of Google’s algorithm updates and what types of sites and pages were impacted.

The SEO industry began with people deciphering these algorithm updates and determining which web pages they affected (and how). Businesses rose and fell on the backs of decisions made due to such insights, and those that were able to course-correct fast enough were the winners. Those that couldn’t learned a hard lesson.

These lessons turned into the “rules of the road” for everyone else, since there was always one constant truth: Algorithmic penalties were the same for each vertical. If your competitor got killed doing something Google didn’t like, you could be sure that as long as you didn’t commit the same mistake, you’d be OK. But recent evidence is beginning to show that this SEO truism no longer holds. Machine learning has made these penalties specific to each keyword environment. SEO professionals no longer have a static set of rules they can play by.

Dr. Pete Meyers, Moz’s Marketing Scientist, recently noted, “Google has come a long way in their journey from a heuristic-based approach to a machine learning approach, but where we’re at in 2016 is still a long way from human language comprehension. To really be effective as SEOs, we still need to understand how this machine thinks, and where it falls short of human behavior. If you want to do truly next-level keyword research, your approach can be more human, but your process should replicate the machine’s understanding as much as possible.”

Moz has put together guides and posts to help people understand Google’s latest artificial intelligence in its search engine, and it has also launched its newest tool, Keyword Explorer, which addresses these changes.

Google decouples ranking updates

Before I get into explaining how things went off the rails for SEOs, I first have to touch on how technology enabled Google’s search engine to get to its current state.

It has only been recently that Google has possessed the kind of computational power to begin to make “real-time” updates a reality. On June 18, 2010, Google revamped its indexing structure, dubbed “Caffeine,” which allowed Google to push updates to its search index quicker than ever before. Now, a website could publish new or updated content and see the updates almost immediately on Google. But how did this work?

[Image: Google Caffeine updates]

Before the Caffeine update, Google operated like any other search engine. It crawled and indexed its data, then sent that indexed data through a massive web of spam filters and algorithms that determined its eventual ordering on Google’s search engine results pages.

After the Caffeine update, however, select fresh content could go through an abbreviated scoring process (temporarily) and go straight to the search results. Minor things, like an update to a page’s title tag or meta description tag, or a published article for an already “vetted” website, would be candidates for this new process.

Sounds great, right? As it turned out, this created a huge barrier to establishing correlation between what you changed on your website and how that change affected your ranking. The detaching of updates to its search results — and the eventual thorough algorithmic scoring process that followed — essentially tricked many SEOs into believing that certain optimizations had worked, when in fact they hadn’t.

[Image: Google’s old index vs. the Caffeine index. Source: Google Official Blog]

This was a precursor to the future Google, which would no longer operate in a serialized fashion. Google’s blog effectively spelled out the new Caffeine paradigm: “[E]very second Caffeine processes hundreds of thousands of pages in parallel.”

From an obfuscation point of view, Caffeine provided broad cover for Google’s core ranking signals. Only a meticulous SEO team, which carefully isolated each and every update, could now decipher which optimizations were responsible for specific ranking changes in this new parallel algorithm environment.

When I reached out to him for comment, Marcus Tober, founder and CTO of Searchmetrics, said, “Google now looks at hundreds of ranking factors. RankBrain uses machine learning to combine many factors into one, which means factors are weighted differently for each query. That means it’s very likely that even Google’s engineers don’t know the exact composition of their highly complex algorithm.”

“With deep learning, it’s developing independently of human intervention. As search evolves, our approach is evolving with Google’s algorithmic changes. We analyze topics, search intention and sales funnel stages because we’re also using deep learning techniques in our platform. We highlight content relevance because Google now prioritizes meeting user intent.”

These isolated testing cycles were now very important in order to determine correlation, because day-to-day changes on Google’s index were not necessarily tied to ranking shifts anymore.
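For teams doing this kind of isolated testing, the bookkeeping can be as simple as logging one change at a time and only attributing rank movement after a settling window has passed. The sketch below is purely illustrative: the change log, daily rank readings and ten-day settling window are made-up values for the example, not anything Google publishes.

```python
from datetime import date, timedelta

# Hypothetical log of isolated on-site changes (one change at a time)
# and daily rank readings for a single tracked keyword.
changes = [
    {"date": date(2017, 3, 5), "change": "rewrote title tag"},
    {"date": date(2017, 3, 20), "change": "added structured data"},
]
daily_ranks = {
    date(2017, 3, 1) + timedelta(days=i): rank
    for i, rank in enumerate([14, 14, 13, 12, 12, 11, 11, 11, 10, 10,
                              10, 10, 9, 9, 9, 9, 9, 9, 9, 8,
                              8, 8, 7, 7, 6, 6, 6, 6, 5, 5])
}

SETTLE_DAYS = 10  # wait for the full re-scoring pass, not the instant Caffeine push

for c in changes:
    before = daily_ranks.get(c["date"] - timedelta(days=1))
    after = daily_ranks.get(c["date"] + timedelta(days=SETTLE_DAYS))
    if before is not None and after is not None:
        print(f"{c['change']}: rank {before} -> {after} after {SETTLE_DAYS} days")
```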

The splitting of the atomic algorithm

As if that weren’t enough, in late 2015, Google released machine learning within its search engine, which continued to decouple ranking changes from its standard ways of doing things in the past.

As industry veteran John Rampton reported in TechCrunch, the core algorithms within Google now operate independently based on what is being searched for. This means that what works for one keyword might not work for another. This splitting of Google’s search rankings has since caused a tremendous amount of grief within the industry as conventional tools, which prescribe optimizations indiscriminately across millions of keywords, could no longer operate on this macro level. Now, searcher intent literally determines which algorithms and ranking factors are more important than others in that specific environment.

This is not to be confused with the recent announcement that there will be a separate index for Mobile vs. Desktop, where a clear distinction of indexes will be present. There are various tools to help SEOs understand their place within separate indexes. But how do SEOs deal with different ranking algorithms within the same index?

The challenge is to categorize and analyze these algorithmic shifts on a keyword basis. One technology that addresses this — and is getting lots of attention — was invented by Carnegie Mellon alumnus Scott Stouffer. After Google repeatedly attempted to hire him, Stouffer decided instead to co-found an AI-powered enterprise SEO platform called Market Brew, based on a number of patents that were awarded in recent years.

Stouffer explains, “Back in 2006, we realized that eventually machine learning would be deployed within Google’s scoring process. Once that happened, we knew that the algorithmic filters would no longer be a static set of SEO rules. The search engine would be smart enough to adjust itself based on machine learning what worked best for users in the past. So we created Market Brew, which essentially serves to ‘machine learn the machine learner.'”

“Our generic search engine model can train itself to output very similar results to the real thing. We then use these predictive models as a sort of ‘Google Sandbox’ to quickly A/B test various changes to a website, instantly projecting new rankings for the brand’s target search engine.”

Because Google’s algorithms work differently from one keyword to the next, Stouffer says there are no clear delineations anymore. Combinations of keyword, user intent and prior success or failure determine how Google weights its various core algorithms.
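To picture how such a “Google Sandbox” might be used, here is a minimal sketch of a sandbox-style A/B test: score a results page with an already-fitted model, apply a proposed on-page change, and re-rank. The weights, factor scores, URLs and the proposed change are all hypothetical placeholders, not Market Brew’s product or Google’s actual signals.

```python
# A fitted model is reduced here to a weight per factor (placeholder values).
weights = {"title_relevance": 0.5, "backlinks": 0.3, "structured_data": 0.2}

# Hypothetical factor scores for the pages competing on one keyword.
pages = {
    "our-site.com/widgets":     {"title_relevance": 0.4, "backlinks": 0.6, "structured_data": 0.1},
    "competitor-a.com/widgets": {"title_relevance": 0.8, "backlinks": 0.5, "structured_data": 0.3},
    "competitor-b.com/widgets": {"title_relevance": 0.6, "backlinks": 0.7, "structured_data": 0.6},
}

def rank(candidates):
    """Order pages by the model's weighted score, best first."""
    def score(page):
        return sum(weights[f] * v for f, v in candidates[page].items())
    return sorted(candidates, key=score, reverse=True)

baseline = rank(pages)

# Proposed optimization: rewrite the title tag, which the model would score higher.
proposed = {page: dict(factors) for page, factors in pages.items()}
proposed["our-site.com/widgets"]["title_relevance"] = 0.9

projected = rank(proposed)

print("Baseline :", baseline)
print("Projected:", projected)
print("Our page moves from position",
      baseline.index("our-site.com/widgets") + 1,
      "to", projected.index("our-site.com/widgets") + 1)
```

The point is not the toy numbers but the workflow: the fitted model stands in for Google, so a proposed change can be evaluated in minutes rather than waiting weeks for the live index to respond.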

Predicting and classifying algorithmic shifts

Is there a way we, as SEOs, can start to quantitatively understand the algorithmic differences/weightings between keywords? As I mentioned earlier, there are ways to aggregate this information using existing tools. There are also some new tools appearing on the market that enable SEO teams to model specific search engine environments and predict how those environments are shifting algorithmically.

A lot of the answers depend on how competitive and broad your keywords are. For instance, a brand that only focuses on one primary keyword, with many variations of subsequent long-tail keyword phrases, will likely not be affected by this new way of processing search results. Once an SEO team figures things out, they’ve got it figured out.

On the flip side, if a brand has to worry about many different keywords that span various competitors in each environment, then investment in these newer technologies may be warranted. SEO teams need to keep in mind that they can’t simply apply what they’ve learned in one keyword environment to another. Some sort of adaptive analysis must be used.

Summary

Technology is quickly adapting to Google’s new search ranking methodology. There are now tools that can track each algorithmic update, determining which industries and types of websites are affected the most. To combat Google’s new emphasis on artificial intelligence, we’re now seeing the addition of new search engine modeling tools that are attempting to predict exactly which algorithms are changing, so SEOs can adjust strategies and tactics on the fly.

We’re entering a golden age of SEO for engineers and data scientists. As Google’s algorithms continue to get more complex and interwoven, the SEO industry has responded with new high-powered tools to help understand this new SEO world we live in.

8 ways SEO has changed in the past 10 years (Nov. 1, 2016)

How has the search landscape changed over the last decade? Columnist Jayson DeMers explores the biggest shake-ups over the last 10 years and their impact on search engine optimization (SEO).

Few marketing channels have evolved as quickly or as dramatically as search engine optimization (SEO). In its infancy, SEO was the shady practice of stuffing keywords, tweaking back-end code and spamming links until you started ranking well for the keywords you wanted. Thankfully, Google stamped out those practices pretty quickly, and its search algorithm has never really stopped evolving.

Much of Google’s foundation was in place by the mid-2000s, but how has its algorithm — and as a result, our approach to SEO — changed in the past 10 years?

1. The rise of content

First, there’s the rise of content marketing as part of a successful SEO strategy. Google has steadily refined what it considers to be “good” content over the years, but it was the Panda update in 2011 that served as the death blow to spammy content and keyword stuffing.

After Panda, it was virtually impossible to get away with any gimmicky content-based tactics, such as favoring a high quantity of content while forgoing quality and substance. Instead, the search engine winners were ones who produced the best, most valuable content, spawning the adoption of content marketing among SEOs — and content is still king today.

2. The death of link schemes

Google has provided its own definition of what a “link scheme” actually is, along with some examples. Many find the guidelines here somewhat ambiguous, but the simplest explanation is this: Any attempt to deliberately influence your ranking with links could qualify as a scheme.

By the late 2000s, Google had worked hard to stamp out most black-hat and spam-based link-building practices, penalizing participants in link wheels, link exchanges and paid linking. But it was in 2012, with the Penguin update, that link building really became what it is today. Now, only natural link attraction and valuable link building with guest posts will earn you the authority you need to rank higher.

3. The reshaping of local

Compared to 2006, local SEO today is a totally different animal. There have been dozens of small iterations and changes to the layout (such as the local carousel, and today’s modern “3-pack” layout), but the biggest recent change to ranking factors was in 2014, with the Pigeon update.

With this update, Google more heavily incorporated traditional web ranking signals into its ranking algorithm, giving well-optimized websites a major edge in local search. Google also boosted the visibility of high-authority directory websites in its search results.

More generally, local searches have become more common — and more location-specific — over the last few years, thanks to mobile devices.

4. SERP overhauls

I can’t tell you how many times the search engine results pages (SERPs) have changed, and not many people could; some of these changes are so small, it’s debatable whether to even count them. But take a look at a SERP screenshot from 2006 and compare it to today, and you’ll see how different your considerations must be.

[Image: Google search results in 2006. (Source)]

5. The rise of the Knowledge Graph

Another major influencer in modern SEO has been Google’s Knowledge Graph, which first emerged on the scene in 2012. The Knowledge Graph attempts to give users direct, concise answers to their queries, often presenting them with a box of information about a general subject or a succinct answer to a straightforward query. This is great for the user but often takes precedence over organic search results.

Accordingly, optimizers have had to compensate for this, either by avoiding generally answerable keyword targets altogether or by using Schema.org microformatting to make their on-site content more easily deliverable to the system.
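As a concrete illustration of that second option, here is a small Python snippet that emits Schema.org Article markup as JSON-LD, one common way to express this kind of structured data. The property names are standard Schema.org fields, but the page details are invented placeholders.

```python
import json

# Hypothetical page details; the keys are standard Schema.org Article properties.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to choose a standing desk",
    "author": {"@type": "Person", "name": "Jane Example"},
    "datePublished": "2016-10-01",
    "description": "A plain-language buying guide for standing desks.",
}

# Embed the output in a <script type="application/ld+json"> tag in the page's <head>.
print(json.dumps(article_jsonld, indent=2))
```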

6. Mobile prioritization

Mobile devices have exploded in popularity since the iPhone first emerged back in 2007, and Google has done everything it can to emphasize the importance of optimizing websites for those mobile users. Indeed, in 2015, mobile queries officially surpassed desktop queries in Google search.

Optimizing for mobile has become not only common, but downright required these days, in no small part due to Google’s continuing and escalating insistence. Its mobile-friendly update, which occurred in two separate phases, has been a major enforcer of this new standard.

7. The soft death of keywords

Panda and Penguin killed off the practice of keyword stuffing, but a smaller, more curious update in 2013 spelled the “soft” death of keyword optimization altogether. Hummingbird is the name of the update that introduced semantic search, Google’s way of deciphering user intent rather than mapping out individual keywords and phrases.

Today, Google attempts to understand meaning rather than matching keywords, so keyword-centric optimization doesn’t work the same way. However, keyword research is still relevant, as it can help guide your strategic focus and provide you with ranking opportunities.

8. Update pacing and impact

It’s also worth noting that for a time — in the few years following Panda — Google stressed out search optimizers by releasing seemingly random, major updates to its search algorithm that fundamentally changed how rankings were calculated. However, now that the search engine has reached a strong foundation, the significance and pacing of these updates have declined. Today, updates are smaller, less noticeable, and roll out gradually, giving them a much less dramatic impact on the industry.

Final thoughts

Understanding where SEO has come from and where it stands today will help you become a better online marketer. Hopefully, you long ago eliminated any black-hat techniques from your strategy.

Google — and we, as marketers alongside it — are constantly pushing this now-fundamental element of our lives forward, so if you want to stay relevant, you’ll need to keep focused on the next 10 years of search engine updates.

Why links are still the core authority signal in Google’s algorithm (Aug. 11, 2016)

Link metrics have been the foundation of Google’s ranking algorithm since the beginning, but could anything ever surpass links as a ranking signal? Columnist Jayson DeMers speculates.

It’s almost impossible to see any meaningful search engine optimization (SEO) results without spending some time building and honing your inbound link profile.

Of the two main deciding factors for site rankings (relevance and authority), one (authority) is largely dependent on the quantity and quality of links pointing to a given page or domain.

As most people know, Google’s undergone some major overhauls in the past decade, changing its SERP layout, offering advanced voice-search functionality and significantly revising its ranking processes. But even though its evaluation of link quality has changed, links have been the main point of authority determination for most of Google’s existence.

Why is Google so dependent on link metrics for its ranking calculations, and how much longer will links be so important?

The concept of PageRank

To understand the motivation here, we have to look back at the first iteration of PageRank, the signature algorithm of Google Search, named after co-founder Larry Page. It uses the presence and quality of links pointing to a site to gauge that site’s authoritativeness.

Let’s say there are 10 sites, labeled A through J. Every site links to site A, and most sites link to site B, but the other sites don’t have any links pointing to them. In this simple model, site A would be far likelier to rank for a relevant query than any other site, with site B as a runner-up.

[Diagram: links pointing to sites A and B]

But let’s say there are two more sites that enter the fray, sites K and L. Site L is linked to from sites C, D and E, which don’t have much authority, but site K is linked to from site A, which has lots of authority. Even though site K has fewer links, the higher authority link matters more — and might propel site K to a similar position as site A or B.

[Diagram: link authority with sites K and L added]
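For readers who want to see the mechanics, the toy graph above can be pushed through the original PageRank iteration in a few lines of Python. This is a sketch of the published 1998 formulation with the conventional 0.85 damping factor, not Google’s production system, and the link graph is an approximation of the hypothetical example.

```python
# Toy link graph from the example above: every site links to A, most link to B,
# C, D and E also link to L, and A's single outbound link points to K.
links = {
    "A": ["K"],
    "B": ["A"],
    "C": ["A", "B", "L"], "D": ["A", "B", "L"], "E": ["A", "B", "L"],
    "F": ["A", "B"], "G": ["A", "B"], "H": ["A", "B"],
    "I": ["A", "B"], "J": ["A", "B"],
    "K": ["A"],
    "L": ["A"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank over a dict of page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Running it puts site A on top and pulls site K up next to it, ahead of B and L, purely on the strength of K’s single high-authority link, which is the intuition the example is after.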

The big flaw

PageRank was designed to be a natural way to gauge authority based on what neutral third parties think of various sites; over time, in a closed system, the most authoritative and trustworthy sites would rise to the top.

The big flaw is that this isn’t a closed system; as soon as webmasters learned about PageRank, they began cooking up schemes to manipulate their own site authority, such as creating link wheels and developing software that could automatically acquire links on hundreds or thousands of unsuspecting websites at the push of a button. This undermined Google’s intentions and forced them to develop a series of checks and balances.

Increasing phases of sophistication

Over the years, Google has cracked down hard on such rank manipulators, first punishing the most egregious offenders by blacklisting or penalizing anyone participating in a known link scheme. From there, they moved on to more subtle developments that simply refined the processes Google used to evaluate link-based authority in the first place.

One of the most significant developments was Google Penguin, which overhauled the quality standards Google set for links. Using more advanced judgments, Google could now determine whether a link appeared “natural” or “manipulative,” forcing link-building tactics to shift while not really overhauling the fundamental idea behind PageRank.

Other indications of authority

Of course, links aren’t the only factor responsible for determining a domain or page’s overall authority. Google also takes the quality of on-site content into consideration, thanks in part to the sophisticated Panda update that rewards sites with “high-quality” (well-researched, articulate, valuable) content.

The functionality of your site, including its mobile-friendliness and the availability of content to different devices and browsers, can also affect your rankings. But it’s all these factors together that determine your authority, and links are still a big part of the overall mix.

Modern link building and the state of the web

Today, link building must prioritize the perception of “naturalness” and value to the users encountering those links. That’s why link building largely exists in two forms: link attraction and manual link building.

Link attraction is the process of creating and promoting valuable content in the hope that readers will naturally link to it on their own, while manual link building is the process of placing links on high-authority sources. Even though marketers are, by definition, manipulating their rankings whenever they do anything known to improve their rankings, there are still checks and balances in place that keep these tactics in line with Google’s Webmaster Guidelines.

Link attraction tactics won’t attract any links unless the content is worthy of those links, and manual link-building tactics won’t result in any links unless the content is good enough to pass a third-party editorial review.

The only sustainable, ongoing manual link-building strategy I recommend is guest blogging, the process by which marketers develop relationships with editors of external publications, pitch stories to them, and then submit those stories in the hope of having them published. Once published, these stories achieve myriad benefits for the marketer, along with (usually) a link.

Could something (such as social signals) replace links?

Link significance and PageRank have been the foundation for Google’s evaluation of authority for most of Google’s existence, so the big question is: could anything ever replace these evaluation metrics?

More user-centric factors could be a hypothetical replacement, such as traffic numbers or engagement rates, but user behavior is too variable and may be a poor indication of true authority. It also eliminates the relative authority of each action that’s currently present in link evaluation (i.e., some users wouldn’t be more authoritative than others).

Peripheral factors like content quality and site performance could also grow in their significance to overtake links as a primary indicator. The challenge here is determining algorithmically whether content is high-quality or not without using links as a factor in that calculation.

Four years ago, Matt Cutts squelched that notion, stating at SMX Advanced 2012, “I wouldn’t write the epitaph for links just yet.” Years later, in a Google Webmaster Video from February 2014, a user asked if there was a version of Google that excludes backlinks as a ranking factor. Cutts responded:

We have run experiments like that internally, and the quality looks much, much worse. It turns out backlinks, even though there’s some noise and certainly a lot of spam, for the most part, are still a really, really big win in terms of quality of our search results. So we’ve played around with the idea of turning off backlink relevance, and at least for now, backlink relevance still really helps in terms of making sure that we return the best, most relevant, most topical set of search results.

The safe bet is that links aren’t going anywhere anytime soon. They’re too integrated as a part of the web and too important to Google’s current ranking algorithm to be the basis of a major overhaul. They may evolve over the next several years, but if so, it’ll certainly be gradual, so keep link building as a central component of your SEO and content marketing strategy.

7 e-commerce SEO trends we’re seeing in 2016 (June 14, 2016)

For those managing search engine optimization for e-commerce websites, contributor Jayson DeMers has some advice for what to focus on to stay ahead of the competition.

Few types of online business can benefit from SEO more than e-commerce websites that allow for direct consumer transactions. Not only can you secure more web traffic (and a larger stream of revenue), you can also optimize specific product pages to funnel traffic to your most profitable or popular pages.

But SEO (and e-commerce in general) is always evolving. New technologies, new insights and new best practices emerge on a regular basis, and the best e-commerce webmasters are jumping on these changes to stay ahead of the competition.

Below, I’ve compiled a list of seven important SEO trends in the e-commerce industry you should be paying attention to:

1. Out-of-the-box SEO is better than ever

SEO technology is developing just as quickly as the search engines that inspired it. What do I mean by “SEO technology”? I mean the third-party apps, widgets and tools webmasters can use to optimize their sites and improve results — with minimal manual input required.

In fact, some “out of the box” solutions have emerged in the template web design industry, enabling webmasters to ensure the on-site optimization of their sites in just a few steps upon launch. WordPress plugins that handle a good amount of on-site SEO automatically, such as Yoast SEO, have also been around for a while.

These products and developments are tempting, and in fact useful, but currently, there’s no solution that can automatically perform every on-site function. You’ll still need to customize things like your title tags, navigation, rich snippets and so on, if you want to see the best possible results.

2. Long-form content is crucial

Until recently, product pages on e-commerce sites were places for short-form content: a title, a brief description, a handful of photos and a few customer reviews. However, user demand and search engine favoritism have shifted toward long-form content in almost every niche.

Longer-form content provides more detail, more long-tail and conversational phrases (which lend themselves to more relevant search queries) and more market differentiation from the increased competition that has arisen in recent years.

I strongly encourage you to develop more long-form content on your company blog, describing your products and offering insights on your company, provided your topics support that length without unnecessary fluff.

3. Sharability is key

Social media has been popular for many years, but it’s still somehow escalating in importance. In a recent survey I conducted of 357 online marketers, 52 percent of respondents said they are currently seeing a positive ROI from social media marketing, while 65 percent believe it will become even more important over the course of the next five years. Most notably, 96 percent of respondents said they planned to increase their budgets or keep them the same over the next year.

More users are signing up for high-popularity standbys like Facebook, and newer, cutting-edge platforms like Instagram and Snapchat are shaping up to be major hits for younger generations.

One of the best ways to generate more visibility and more primary and secondary ranking signals (like inbound links and social signals, respectively) is to encourage more social sharing throughout the shopping and checkout process.

Have your users share your products. Have them share reviews. Have them share when they check out or when their products arrive. Keep your audience engaged with social opportunities throughout your site, and your visibility across search engines and social media channels will thrive.

4. Video content is outperforming pretty much every other kind of content

As mobile devices, WiFi availability and video sharing capabilities become more advanced and prominent, users are demanding more video content. Video content can show up as rich media in search results (if it’s hosted on YouTube) and has more potential for virality than any other type of content.

In fact, if you aren’t using video content on your product pages and in your company blog, you’re already behind the times. Video content is only going to become more popular, so get moving.

5. Mobile optimization is now absolutely critical

The basics of mobile optimization were already solidified by Google’s Mobilegeddon update, but merely meeting Google’s thresholds for mobile optimization is no longer enough to stand out in the search world.

Mobile optimization is about offering the best possible content and functionality experience to mobile users, whose numbers grow relative to desktop users by the day.

Mobile optimization is also starting to include app optimization, which Google is favoring heavily with developments like app streaming — and one day soon, e-commerce platforms may need to develop their own mobile apps just to survive in terms of visibility.

6. Voice search and digital assistants are gaining popularity and usage

Just a few years ago, digital assistants seemed like useless gimmicks that failed to recognize voices accurately and provided less-than-stellar results even when they did. Now, more people are relying on voice search, and every major tech company seems to have its own digital assistant capable of extraordinary feats, including Siri, Alexa, Cortana and Google Now.

Savvy e-commerce marketers are beginning to capitalize on this trend, offering more colloquial phrasing, more optimization for long-tail phrases and more “rich answers” that digital assistants can provide directly.

7. Local results are becoming more prominent

Local SEO has undergone a handful of overhauls in the past few years, and it’s likely that new technologies (like wearable tech) will increase the importance of local results even further.

E-commerce companies often don’t think about a local strategy, since they operate on a national level and therefore want to target a larger national audience. However, pursuing a local strategy in addition can help e-commerce companies differentiate themselves from the competition and target a smaller, possibly more relevant niche that their competitors are deliberately trying to avoid.

There may be a clustering effect as more e-commerce companies begin to realize the benefits here, which is good motivation to get involved as early as possible.

Final thoughts

Keep an eye on these seven trends to ensure that your campaign remains relevant and visible in the modern era. Depending on your goals and how heavy a role SEO plays in your overall business growth, the suggestions above should take a high priority in your marketing spend.

That being said, these certainly aren’t the only trends I anticipate developing for e-commerce, and it’s hard to predict exactly what’s around the corner — so keep your campaign flexible, and always be on the lookout for the next breakthrough development.

The SEO industry is worth $65 billion; will it ever stop growing? (May 9, 2016)

SEO spend has been steadily growing since the early days of search engines, but is there an end in sight? Columnist Jayson DeMers looks at what factors might impact the growth of SEO in the near future.

Since its early days, search engine optimization (SEO) has always had naysayers insisting that this marketing discipline is a passing fad, or that it’s dead.

Not only has SEO survived this long, it’s thriving: According to a recent study by Borrell Associates, companies are going to spend $65 billion on SEO in 2016. This is more than triple what they predicted for this year back in 2008, before major game-changers like Panda and Penguin even entered the equation.

What’s more, the company is predicting that the SEO industry will continue to grow to an estimated $72 billion by 2018 and $79 billion by 2020.

Though estimates can be fallible, this does suggest that SEO has grown even more than previously expected, with a trajectory to preserve that growth well into the future. In fact, another recent survey of 357 marketers found that more than 90 percent plan to increase their SEO budgets or keep them the same over the next year. Assuming these projections are at least roughly accurate, is there anything that will stop SEO from growing?

Factors for perpetual SEO growth

Let’s take a look at some of the reasons SEO might continue to grow indefinitely:

  • More user searches. It’s likely that the number of searches per user will grow well into the future. Older generations, averse to technology, will make way for younger generations, who rely on technology for everything. Plus, technologies will become faster and more convenient, enabling even more search traffic for each user in circulation.
  • More users. The sheer number of search users will also feasibly increase, compounding the effects of the per-user search growth. This is largely due to the internet becoming more affordable and more available to different demographics. One day soon, thanks to efforts by Google, Facebook and other companies, we may enjoy universal availability of the internet. And technologies such as self-driving cars will give users more time to perform searches at times when they previously couldn’t. These changes will make it possible for almost anyone to search for anything at any time.
  • More outlets for search visibility. There will also be more outlets for search visibility, beyond the conventional search engines we’ve come to know (e.g., Google and Bing). Alternative search engines will certainly rise, but there are two main areas where I expect radical growth: first, the use of digital assistants, which bridge the gap between online and offline search; and second, search engines specific to individual platforms, like app store-based engines, Amazon.com or YouTube search.
  • Decreasing power of traditional ads. Traditional advertising methods have been dying for a long time, and they’ll continue dwindling in power until they eventually fade away. When they finally do bite the dust, a number of businesses dependent on traditional ads as a means of customer acquisition will have no choice but to look to inbound marketing campaigns in the online world to supplement their acquisition strategies.
  • Increasing SEO sophistication. We’re getting better at creating and managing more intense SEO campaigns. As a simple example, what used to be a matter of keyword stuffing and cheap link building has now become an intricate strategy of content development and publication. Furthermore, we have access to more data than we’ve ever had before, and our capacity will only grow from here.

Factors against unlimited growth

And now, some of the reasons why SEO may face an eventual halt or decline:

  • Competition and prohibitive costs. Rising SEO spend means that more businesses are getting involved in SEO, which means more competition to deal with. For a while, this will be fine, but eventually the cost of entry will become prohibitive, and there will be a “tipping point” where the rise in spending tapers off.
  • The Knowledge Graph and visibility decline. Thanks to the Knowledge Graph (and similar future technological developments), users are being given more immediate forms of answers, reducing their reliance on individual site visitations to find what they’re looking for. This could eventually start compromising the ROI of SEO, pushing people out of the game.
  • Alternative search modes. Search is starting to evolve in some weird forms, including personal digital assistants, which marry online and device-specific search. These alternative modes of search are harder to predict and harder to “rank” for, since oftentimes they forgo a “ranking” process entirely.
  • RankBrain and decreasing rank predictability. Machine learning is already huge, and it’s only getting bigger. Technologies like RankBrain are starting to upgrade search systems in real time, with processes only AI programs can incorporate. That’s going to make it harder and harder to accurately assess ranking factors and respond accordingly.

The problem with definitions

It’s also important to recognize what may actually qualify as “SEO” in the strict sense. Today, this term largely refers to optimizing a website to be featured higher in organic search rankings, but already it’s starting to apply to other areas, from local results to Knowledge Graph entries, and even digital assistant-based results.

As new forms of search technology evolve, it’s likely that SEO will adapt with the times, rather than dying outright. If that’s the case, spending on what we see as “SEO” today may disappear, but spending on what we label “SEO” in the future may continue to perpetually rise.

The bottom line

It’s hard to look more than a few years into the future with so many variables and potential technological developments in play. However, it’s likely that SEO will continue to grow in popularity, in one form or another, for the foreseeable future.

With that information, you should at least feel comfortable investing further into your existing strategy. For search optimizers, that also means a positive outlook on your job security — as long as you’re willing to adapt.

10 ways link building has changed over the last 10 years (April 15, 2016)

Though white-hat tactics have changed over the years, link building remains one of the most important elements of SEO. Columnist Jayson DeMers takes a look back at the evolution of link building.

Search engine optimization (SEO) is a dynamic digital marketing discipline, and few SEO tactics have evolved as much (or as frequently) as link building.

What was once a spam-laden nightmare of link pyramids, spun content, automated spam posts and other gimmicks has changed dramatically, and over the course of the past decade, link building has become more refined and quality-driven.

Much of this evolution was spurred by search engines directly — Google, for one, played an enormous part in cleaning up the web with its Penguin algorithm updates.

Combined with increasing user distaste for poorly placed links and a collective commitment from webmasters to give their users better experiences, we now exist in a world where link building is respectable, valuable and viable.

Take a look at these 10 major ways link building has changed in just 10 years:

1. Penalties are harsher

Link schemes have always been bad, though they haven’t always been penalized. In recent years, however, Google has stepped up its effort to penalize sites with spammy backlink profiles.

If you engage in a link scheme, you’re more likely than ever before to face a manual penalty that will drop you from the SERPs. Recovery is always possible, but these types of schemes have the potential to set you back months, or even years.

2. Crappy links don’t work anymore

Just 10 years ago, a link was a link. You could easily get away with posting a non-contextual link pointing back to your domain as a forum post, spammy blog comment, article on an article directory, or even a free community blog.

Today, such tactics are no longer tolerated; site editors know that if they don’t keep their sites free of spammy links, Google will penalize them. And if Google catches what it considers to be spammy links, it will neutralize the link’s value, leaving you with practically no authoritative gains. If Google detects a pattern of spammy links, you’re likely in for a manual or algorithmic penalty.

3. Same-source links have greater diminishing returns

Links from different domains have always returned more value than additional links from the same domain; this is because links serve as third-party indicators of credibility, and links from the same domain offer a redundant vouch for authority.

However, this effect of “diminishing returns” has escalated over the past 10 years. Today, same-domain links likely still carry some value, but less than ever before.

4. Guest posts have become the gold standard of off-site link-building tactics

Guest posting, the process of writing an article and getting it published on an external publication, has come to be the “gold standard” of link building. Because the focus is on creating quality content to reach a new audience, it delivers far more than just SEO value. There’s virtually no risk of penalty, and it’s not so complicated or intensive that the effort it takes outweighs the reward.

Guest posts were always a good strategy, but in my opinion, they are currently among the best strategies when it comes to off-site tactics.

To clarify, I’d argue that publishing quality content to your own website that attracts inbound links on its own merit is the absolute best tactic for link building, but I don’t consider that an off-site tactic. It’s also impractical to build links this way in many industries (or without a pre-existing audience, which necessitates off-site tactics).

5. Content standards have risen

This isn’t to say that anybody can contribute as a guest anywhere they want. The popularity of guest posting has had another effect on online communities: Thanks to increasing competition and growing awareness of the value of link building, most major publishers have significantly raised the content standards they expect from outside contributors.

This means it’s much harder to land guest posting opportunities (you now have to build real relationships with editors and webmasters), and it’s much harder to produce content of high enough quality for those publications.

6. Press release links are pretty much worthless

Press releases were once a popular tactic for link building. News sources were extremely high in authority, and as long as you had a newsworthy topic, it was fairly easy to get yourself a featured link by writing and submitting a press release through one of the major press release distribution hubs.

However, thanks to the surge in popularity of this tactic, Google has significantly downgraded the authoritative power of links from press releases.

7. Keyword-rich anchor text can get you in big trouble

A decade ago, using keyword-rich anchor text was the best way to give specific ranking power to your inbound links. Today, Google’s quality evaluations are so sophisticated that they can detect unnatural use of anchor text for manipulative purposes, and it’s now among the most recognizable indicators of a spammy link. Anchor text should be natural to avoid triggering a penalty.

8. Link earning is a viable tactic

The phrase “link building” refers to the manual process of placing links on external sites. But as I mentioned in #4, there’s an even better way to get natural links: earn them on your own with fantastic content that acts like a magnet to attract links.

Years ago, this wasn’t a very attractive link-building tactic because even though Google had guidelines on spammy links, those guidelines were not enforced. So the fastest and cheapest way to improve your search visibility was through spammy, manipulative tactics that were highly popular. Most importantly, those tactics worked.

Now that Google does a good job of discouraging spammy, manipulative link-building practices, link earning has become a genuinely viable tactic.

9. It’s harder than ever to break in

Though many of these developments have made link building simpler (just create, publish and distribute high-quality content), it’s actually harder than ever to build good links.

If you’re going with the “link earning” method, getting natural inbound links on the merits of your own content requires an established audience or some level of pre-existing authority, which makes starting from scratch a major obstacle for new startups and small businesses. It often takes a boost from an existing authority, perhaps through guest posting, to start building an audience and your brand.

10. It isn’t all about rankings

Yes, the primary focus of link building is earning more authority to rank higher in search engines, but there are far more benefits than just rankings. Brand visibility, author reputation and referral traffic are just some of the peripheral ways you can benefit.

Today, link building is about giving customers better content and better experiences in general. If you provide more original, practical, valuable content, every guest post or on-site piece you publish is going to earn more visibility (and therefore, more inbound links).

Though there’s still a bit of technical science to it, link building can no longer be reduced to tricks and gimmicks. That makes link building more complicated, but at the same time, infinitely more rewarding.

The post 10 ways link building has changed over the last 10 years appeared first on Search Engine Land.

]]>
How to find relevant and proof terms for your content /find-relevant-proof-terms-content-241164 Mon, 04 Apr 2016 16:33:15 +0000 /?p=241164 Quality content is great, but it won’t improve your SEO if it’s not based on solid keyword research. Columnist Jayson DeMers shares his tips for focusing your content on the right keywords.

The post How to find relevant and proof terms for your content appeared first on Search Engine Land.

]]>

The worlds of content marketing and SEO are constantly changing, and it can be difficult to keep up. If you’re like many business owners, you may find that all that costly and time-consuming content you’re creating just isn’t getting the search visibility you believe it deserves.

Two new terms you may be hearing more about are “relevant” and “proof” terms. In its 2015 Ranking Factors report, Searchmetrics analyzed top-ranking sites to see which factors were important for ranking in Google; two of the factors that emerged as integral were relevant terms and proof terms.

Finding: The highest-ranking pages contained a 53-percent proportion of relevant terms.

Relevant keywords are ones that suggest to Google that you’re providing comprehensive coverage of a topic. For example, if I were writing about the iPhone, I’d probably naturally use words throughout my article such as “technology,” “apps” and “iOS.” From scanning billions of other pages on the internet, many of which focus on the iPhone, Google knows what related words are most likely to appear in an article about iPhones. So using such words essentially shows Google that I’m providing a sufficient breadth of information related to my main theme.

Finding: The highest-ranking pages contained a 78-percent proportion of proof terms.

Proof terms, on the other hand, are words or phrases that must be used when discussing a particular topic. Using the example above, these might be words like “phone,” “Apple,” and perhaps “mobile.” These words or phrases essentially prove to Google that I’m actually covering the topic at hand.

It’s essential to use relevant and proof terms on a page to attain high search rankings for any given keyword. But how do you go about finding and incorporating these keywords into your content? The good news is that it’s probably easier than you think. This post will walk you through the process.

Keep in mind that isolating and identifying specific proof terms can be difficult, as there is no standard list or set you can use as a reference. Much of this process will be speculative and will rely on your own research skills and in-depth knowledge of the topic you’re covering.

1. Use keyword tools

Related keywords could be synonyms or variations, or they could be words or phrases that are commonly used alongside your primary keyword. For instance, using a primary keyword like “cat,” we could identify a number of related and/or proof terms, such as:

  • feline (synonym);
  • cats (variation);
  • collar (sub-topic);
  • kitten (related term);
  • cat breeds (sub-topic); and
  • Bengal (sub-topic).

Virtually any keyword tool will allow you to identify related terms. Google’s Keyword Planner is a great place to start. After plugging in your primary keyword, you can browse through the “Keyword ideas” tab; however, I prefer to use the “Ad group ideas” tab to find more specific words and phrases. Once you’ve identified a sub-topic you want to pursue, you can click on the phrase to drill down even further to find more related terms.

[Screenshot: Google Keyword Planner]

Ubersuggest is another great tool that can provide you with thousands of related keyword suggestions. Because the sheer volume of results can be rather overwhelming, I suggest entering a more specific multi-word phrase to find the most relevant suggestions. Ubersuggest allows you to choose whether you’d like to search the entire web, images, news, shopping, videos or recipes.

Use your relevant keywords to create sub-topics and sub-headings for your content. This will not only help you structure your content more efficiently, but it will also provide additional SEO benefits when you use those keywords in your heading tags (H1, H2 and so on).

2. Use Google’s “Searches related to” feature

If you want to see which words and phrases Google feels are semantically related to your primary keyword, this is a great strategy. Simply use your keyword in a search query, and then scroll down to the bottom of the page to see the “Searches related to” results.

[Screenshot: Google’s “Searches related to” results]

You can also use Google’s autocomplete feature to see which words and phrases are typically queried in relation to your keyword. Keep in mind that these results may be impacted by relevant searches you’ve performed in the past (if you’re signed in to your Google account).

[Screenshot: Google autocomplete suggestions]
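
If you prefer to collect these suggestions programmatically instead of reading them off the results page, a minimal sketch along the following lines can work. It relies on the unofficial suggestqueries.google.com endpoint that has historically powered browser search boxes; that endpoint is undocumented and may change or be rate-limited at any time, and the parameters shown are assumptions based on common usage rather than a supported API.

    import requests

    def google_suggestions(seed, lang="en"):
        """Fetch autocomplete suggestions for a seed keyword.

        Uses the unofficial suggest endpoint that backs browser search
        boxes; it is undocumented and may change without notice.
        """
        resp = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "hl": lang, "q": seed},
            timeout=10,
        )
        resp.raise_for_status()
        # The response is a JSON array: [seed, [suggestion, suggestion, ...]]
        return resp.json()[1]

    if __name__ == "__main__":
        for phrase in google_suggestions("cat breeds"):
            print(phrase)

Feeding your primary keyword (and any promising sub-topics) through a sketch like this is a quick way to expand your list of candidate relevant terms.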

3. Identify words and phrases top-ranking pages are using

Since we already know that high-ranking pages contain a high proportion of relevant and proof terms, it makes sense to gain insights directly from the winners. Identify the top-ranking pages for your keyword, and then see which words and phrases are being used (besides your primary keyword).

This can be a difficult process to perform manually, but there are some tools that can help. SEO Book’s Keyword Density Analyzer can show you exactly which words are used on a page and how often. It will tell you which words and phrases are used both in the content of the page and in the metadata.

[Screenshot: SEO Book’s Keyword Density Analyzer]
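
If you’d rather run a similar analysis yourself, here is a rough sketch of the same idea: fetch a top-ranking page, strip the markup and count how often each word appears. This is a simplified stand-in for a dedicated tool (it ignores metadata, multi-word phrases and stemming), and the URL and stop-word list below are purely illustrative.

    import re
    from collections import Counter

    import requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # Minimal, illustrative stop-word list; expand it for real use.
    STOPWORDS = {
        "the", "a", "an", "and", "or", "of", "to", "in", "on", "for",
        "is", "are", "was", "that", "this", "with", "it", "as", "at", "by",
    }

    def term_frequencies(url, top_n=25):
        """Return the most common words in a page's visible text."""
        html = requests.get(url, timeout=10).text
        text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
        words = re.findall(r"[a-z][a-z'-]+", text)
        counts = Counter(w for w in words if w not in STOPWORDS)
        return counts.most_common(top_n)

    if __name__ == "__main__":
        # Illustrative URL; point this at a page that ranks for your keyword.
        for word, count in term_frequencies("https://example.com/top-ranking-page"):
            print(count, word)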

4. Low-tech brainstorming

There’s a chance that even after using the strategies above, you may still be missing some important relevant and proof terms. Ask yourself which words or phrases you would use to find information related to the topic. What variations might you use? What sub-topics would provide useful insights into the main topic? What words or phrases are intimately related to your primary topic?

Visiting industry forums can also clue you in to important keywords. See what questions are being asked and which topics are frequently referenced in relation to your keyword. Visit popular blogs in your niche to see which topics they frequently cover. Look at their high-ranking posts to see which words or phrases are frequently used in regard to your topic (use the strategy outlined in #3).

5. Creating comprehensive content will naturally lead you to include relevant and proof terms

While the tools and strategies above will certainly work to provide you with many relevant and proof terms, there’s another strategy that’s just as effective and perhaps even easier to implement. By focusing on creating long-form, comprehensive content, you will naturally incorporate many relevant and proof terms.

While the strategies above can give you important insights into topics and keywords you can use, there’s no substitute for creating content that covers every angle of a topic.

Creating this type of comprehensive content will necessitate using two main strategies: First, make sure your content is long enough to sufficiently cover the topic. The Searchmetrics study found that the top-ranking pages contained an average of 1,285 words. Second, focus on creating holistic content. This is more a mindset than a specific strategy. Essentially, it requires that you focus on expanding your topic to include all the relevant information your readers will want.

As the report authors point out, focusing only on keywords will only get you so far: “Focusing your optimization on single keywords or keyword lists without providing truly relevant content for the user will not result in long-term success.”

Writing holistic, comprehensive content, on the other hand, will naturally incorporate many (if not most) of your relevant and proof terms and will be far more likely to result in high rankings over the long term. Just be sure you don’t veer into keyword stuffing.

Conclusion

As you can see, there’s no cut-and-dried way to find proof and relevant terms, or at least no way to definitively know which ones Google wants to see.

However, by using keyword tools and insights from Google, seeing which words and phrases top-ranking pages are using and focusing on creating comprehensive content, you’re well on your way to providing information that both Google and your readers will love.

The post How to find relevant and proof terms for your content appeared first on Search Engine Land.

]]>
Is modern SEO more than the sum of independent parts? /modern-seo-just-sum-independent-parts-242638 Mon, 14 Mar 2016 14:30:24 +0000 /?p=242638 Columnist Jayson DeMers asserts that what we think of as SEO is actually just a combination of different customer experience strategies woven together to create the best online presence possible.

The post Is modern SEO more than the sum of independent parts? appeared first on Search Engine Land.

]]>

When you think of SEO, what do you actually think about? If you were going to “practice” SEO, what would you be doing? Would you be writing content? Analyzing your performance? Engaging with your audience on social media?

Modern SEO is a complex, multifaceted collection of different sub-strategies, nearly all of which can function independently as ways to boost brand visibility and build customer relationships. As a quick example, content marketing is a necessary strategy for SEO, but even without a deliberate SEO process, it can be valuable in terms of increasing customer engagement and building brand trust.

With that being said, is modern SEO anything more than just the sum of its interconnected parts? Is there any one strategic initiative that functions exclusively to increase a brand’s rankings for various search queries?

The constituents of SEO

I’m not going to try to list every little factor or tactic that could conceivably impact a company’s organic search rankings, so don’t expect this to be comprehensive. Instead, this is a general list of strategies that all feed into a brand’s search engine performance, one way or another:

  • On-site optimization. This is a general term that covers all kinds of technical improvements and creative choices. Mobile optimization, site speed, site security, meta titles and descriptions, rich snippets and structured data (see the sketch after this list), site architecture, site mapping, navigation structuring and content availability are just some of the ways you can optimize your site directly to be found and favored by search engines. But almost all of these strategies are as much about improving customer experience as they are about making search engines happy: better-structured, faster sites are easier to use.
  • On-site content. On-site content could be called “content marketing,” but I avoided using the term here because content marketing is sometimes associated with a blog. On-site content, on the other hand, includes all pages of a site. The quality, accuracy, conciseness, detail and uniqueness of your content can all help your search rankings (as can the frequency and consistency of your posts), but primarily, this content serves as a means of building customer loyalty.
  • Link building. Link building exists in a few forms. Traditional link building could be considered an SEO-exclusive strategy because that’s its primary function (and most people aren’t interested in referral traffic for these links). However, more advanced, modern link-building tactics involve guest posting and content syndication — and these have far more brand visibility benefits than just ranking higher in search engines.
  • Social media. Social media is often lumped into the “SEO strategy” category, but it actually doesn’t influence SEO directly at all. Instead, it’s a kind of SEO conduit. Engaging with a wider audience means more people to see and share your content, leading to more potential inbound links, which can then influence your website’s organic search rankings.
  • Local SEO. Local SEO strategies specifically involve getting your business listed accurately on third-party directories and review sites, then managing your online reviews. Doing so can increase your chances of earning a slot in Google’s local 3-pack — but more importantly, these efforts increase your reputation with customers.
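
To make the structured-data point in the first bullet concrete: many sites emit schema.org markup as a JSON-LD block in the page head, which helps search engines understand the entity behind the site and can enable rich results. Below is a minimal sketch of generating such a block server-side; the function name and organization details are placeholders of my own, not a real business or a required format.

    import json

    def organization_jsonld(name, url, logo, profiles):
        """Build a schema.org Organization JSON-LD <script> block."""
        data = {
            "@context": "https://schema.org",
            "@type": "Organization",
            "name": name,
            "url": url,
            "logo": logo,
            "sameAs": profiles,  # links to official social profiles
        }
        return (
            '<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>"
        )

    if __name__ == "__main__":
        # Placeholder values for illustration only.
        print(organization_jsonld(
            name="Example Widgets",
            url="https://www.example.com",
            logo="https://www.example.com/logo.png",
            profiles=["https://twitter.com/example", "https://www.facebook.com/example"],
        ))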

Do you notice a pattern here? All of these approaches can be referred to as “SEO strategies,” and all of them can help increase your search visibility. Yet they can (and sometimes do) function independently of SEO to improve customer relationships and experiences.

You can group this suite of services together as “SEO,” but there’s no strategy listed here that’s exclusively focused on improving search rankings.

Keyword-based SEO is dead

It’s also worth mentioning that traditional concepts of SEO — that is, doing a certain amount of online work to rank for a selection of specific keywords — are obsolete. It’s become far more difficult to rank for specific keyword terms these days, thanks to Google’s semantic search functionality, increased sophistication, increased competition, more paid features and the Knowledge Graph.

In that sense, traditional concepts of “SEO” are practically dead. Modern SEO is about weaving different customer experience strategies together to give your brand the best possible online presence.

Arguing over semantics?

You could accuse me of arguing over semantics here, but understanding that modern SEO isn’t an independent strategy, but rather a collection of other independent strategies, is important both for SEO agencies and for independent practitioners.

It’s the responsibility of SEO agencies to make sure every client understands what really goes into SEO — and selling “SEO services” without selling at least some of those other services (e.g., content marketing) is like selling a car without wheels.

The bottom line

Some tactics — including rich snippets and meta descriptions — are executed for the purpose of altering how search results appear, but it’s still important to realize that modern SEO doesn’t exist in a vacuum. It’s more about the complex interrelationships between different online visibility and user experience strategies and less about any one tactic that’s meant to increase your rankings.

SEO is still very much alive and still important, but only in its context as an aggregation of other important strategies. Keep this in mind as you optimize your online presence, both on-site and off-site.

The post Is modern SEO more than the sum of independent parts? appeared first on Search Engine Land.

]]>
Why Grave Misconceptions About SEO Still Persist /grave-misconceptions-seo-still-persist-239008 Thu, 14 Jan 2016 14:20:44 +0000 /?p=239008 Myths and misconceptions about search engine optimization just don’t seem to go away. Columnist Jayson DeMers offers an explanation for why this is so and what SEO professionals can do about it.

The post Why Grave Misconceptions About SEO Still Persist appeared first on Search Engine Land.

]]>

SEO has been around for almost as long as search engines, but attitudes toward the industry and the specific tactics used by SEOs have remained in constant flux since the beginning.

Nevertheless, most of the fundamentals of SEO have remained consistent. Despite this consistency and the relatively predictable nature of gradual, iterative Google algorithm releases, misconceptions about the strategy still persist.

Two Types Of Misconceptions

If you break it down, there are two broad categories of misconceptions about SEO, each of which is damaging in its own way.

The first category is misconceptions about what SEO is and what it’s used for. Misconceptions here include:

  • SEO is a strategy for spammers and scammers, using black hat tactics to manipulate rankings.
  • SEO is expensive and unpredictable and has no real measurable ROI.
  • SEO is a fad or a temporary strategy that isn’t worth pursuing in the long term.
  • One wrong move in SEO can ruin everything you’ve worked for instantly and permanently.

Misconceptions in this category prevent people from pursuing the strategy altogether, and they indirectly weigh on the reputation of the industry.

The second category of misconceptions is related to the strategic execution of a campaign, such as:

  • SEO is about including as many keywords as possible on your site.
  • SEO requires you to build links everywhere you can (or never build links — both ends of the spectrum exist here).
  • Creating more pages on your site will have a meaningful and direct influence on your rankings.
  • If you work hard enough, you can earn a top ranking for any keyword in a few weeks.

These can be even more dangerous, since they influence real actions and can warrant a penalty if abused egregiously enough. Even when they don’t, they lead to disappointed marketers who never see the results they expect.

So why do all these misconceptions still persist, despite the overwhelming volume of content available explaining them away?

To Some, The Internet Is Magic

First, remember that some business owners and marketers — especially those from an older generation — view technology (and the internet especially) as an unnecessary novelty. They’re used to advertising their businesses through traditional means, such as word-of-mouth and printed ads.

The internet is something magical to them, and rather than trying to understand it, they write off any strategies that don’t make immediate sense to them.

These business owners are hard to convince because they’ve had so much experience with more tangible strategies that have worked.

First Impressions Last

When SEO first emerged, it was a somewhat questionable strategy. Spammy keyword stuffing was a viable means of getting your page ranked, spammers tended to reap the greatest benefits, and most people had to wade through pages of results to find what they were really looking for.

This picture of SEO became a first impression for an entire generation of marketers, and unfortunately, that first impression continues to exist in a fraction of the population.

Had SEO first emerged as the well-balanced, technically complex strategy it is today, many of these misconceptions might not have had the chance to form.

SEO Is Always Changing

It’s also worth noting that SEO is constantly moving and changing. As I mentioned, keyword stuffing was the norm until Google took countermeasures to devalue such spammy tactics.

But think of all the Google updates we’ve seen in the last few years alone — Panda, Penguin, Pigeon, Hummingbird, Mobilegeddon, and the list goes on.

Since each update buries a handful of old practices and introduces a handful of new ones, fresh misconceptions are born every few months. The process is gradual, so it’s hard to notice, but it’s there, constantly seeding the field with new misunderstandings.

Not All Agencies Are Well-Intentioned

While most modern SEO agencies with a decent reputation have earned their place as trusted authorities, there are still dozens, if not hundreds, of ill-intentioned scammers taking advantage of people who know nothing about SEO. They make unreasonable promises, offer obsolete services and generally use bad practices to manipulate rankings for short-term gain.

If a person’s first impression of SEO comes from one of these disreputable sources, it could easily leave a bad taste in their mouth (or set them up for failure if they choose to continue).

Results Can Be Misleading

Last, but not least, some misconceptions arise naturally because SEO results can be difficult to interpret.

If you introduce three or four new strategies and notice an uptick in your traffic, it could be any one strategy, or some combination of them, that was responsible for the growth. Or it could be a fluke occurrence unrelated to what you changed.

The misinterpretation of results and patterns leads to new misconceptions being formed every day — at least for those trying to make an impact.

Conclusion

No matter how many articles are written about these misconceptions or how much evidence surfaces contradicting them, people will still buy into them, at least to some degree. And, as SEO continues to evolve, new misconceptions will continue to emerge.

If you’re an SEO provider, the best thing you can do is explain these misconceptions to your target audience as clearly as possible (perhaps even addressing why so many misconceptions exist) and offer to prove them wrong.

If you’re not, other people’s misconceptions really aren’t your problem. Just keep yourself as informed as possible, and try not to fall behind the times. In fact, if you’re a business owner who stays up to date, other people’s misconceptions could even benefit you, so long as they keep your competitors from implementing effective SEO.

The post Why Grave Misconceptions About SEO Still Persist appeared first on Search Engine Land.

]]>