Conrad Saam – Search Engine Land
News On Search Engines, Search Engine Optimization (SEO) & Search Engine Marketing (SEM)

More content, less traffic: Part II
Mon, 03 Jul 2017

Trying to determine whether it's worth your time to invest in ever more website content? Columnist Conrad Saam lays out a framework to help you decide.


At the end of last year, I shared the results from three of our projects showcasing how we had increased traffic by reducing a website’s overall page count. Simply put, many of our clients had way too much content — pages and pages of content that not only didn’t generate traffic, but also had the unintended consequence of hurting traffic sitewide by diluting keyword and topical relevance. By consolidating pages and reducing the total page count of these sites, we saw some dramatic improvements in overall traffic.

The purpose of this article is to lay out an analytical framework to answer an important question: How do I determine if I should continue to increase the content on my site?

First, I should note that I work in the hypercompetitive legal industry — and lawyers have been belching out web content at a stupendous rate after hearing time and time again that “Content is king” and that “Google likes fresh content.” Law firms around the country have hired small armies of low-end writers, outsourced content abroad and thinly plagiarized each other like mad. Long-tail legal content? Yup, we’ve got that in spades — search for a “lesbian car accident attorney” in any major city, and you’ll find not only pages optimized for that, but entire directories.

The reality is, there’s plenty of legal content out there — it’s really a matter of which legal content rises to the SEO surface.

My content-crazy client

In our most extreme example, we had a new client with a small team who had generated over 4,000 pages of domestically written, lawyer-reviewed, reasonably high-quality content. He had been at this for about three years and was averaging 25 new pages of content a month. Through some aggressive content pruning, we’ve cut about 40 percent of that content and seen dramatic, consistent improvement in their overall performance. Traffic has risen over 50 percent during the four months we’ve been at it.

The most difficult part for our agency was convincing the client to toss all of this work. After all, his content investment was substantial.

The framework for assessing IF you should continue investing in content is the same framework I used to convince him to stop burning his kid’s college fund on a tactic that wasn’t truly benefiting him.

Percentage of pages generating traffic

The most obvious approach is to calculate the likelihood that additional content will actually generate any traffic. To do this, I reviewed six months’ worth of SEO traffic.

First, I used Google Analytics to determine how many pages showed up in the Landing Pages Report with an Organic Segment overlay. This number is the numerator.

Next, I determined the overall page count; you can get the indexed count using either Google Search Console or a site: search. That’s the denominator.

The resulting fraction represents the likelihood that a new page of content will actually generate traffic over a six-month period.
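For readers who want to run this on their own data, the framework reduces to a one-line ratio. This is a minimal sketch; the inputs would come from a Google Analytics landing-page export and a Search Console (or site: search) indexed count, and the sample numbers below are hypothetical:

```python
def traffic_page_ratio(organic_landing_pages: int, indexed_pages: int) -> float:
    """Fraction of indexed pages that received at least one organic visit
    during the lookback window (six months in this framework)."""
    if indexed_pages <= 0:
        raise ValueError("indexed page count must be positive")
    return organic_landing_pages / indexed_pages

# Hypothetical client: 875 of 3,500 indexed pages earned organic traffic.
print(traffic_page_ratio(875, 3500))  # 0.25 -> a new page has ~1 chance in 4
```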

Now, you can take this to a much more sophisticated level. Not all content generates conversions at the same rate — in the legal industry, for example, blog traffic converts at a much lower rate than practice area page traffic. But overall, this is a reasonable framework for answering the question, “Should I be creating more content?”

Back to our content-crazy client. In the graph below, he is represented by the data point at the far bottom right-hand side — almost 3,500 pages of content (about 15 percent of his content hadn’t even been indexed), each of which had about one chance in four of actually generating traffic. To dissuade our new client from continuing his content addiction, I graphed him relative to all of our other clients. The upside for you? I’m sharing those data points below so you can see where your own data falls.

In the graph above, you can clearly see that many law firm sites have way too much content, while a handful of our clients (at the top end of the graph) should be aggressively generating more content, as any individual page has a very high likelihood of generating more traffic.

Sessions per page

Another analytical lens through which to view the “Should I generate more content?” question is to calculate the average number of organic entry sessions per page. Essentially, if a page does generate traffic, how much traffic would that be?

Our content-crazy client is once again underperforming, averaging just over one session per page every six months. Also worth noting are the many lawyers who have started and failed with blog content, seduced by the promise of SEO silver bullets based on the “content is king” premise.
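The sessions-per-page lens is equally simple to compute. Again, a sketch with made-up numbers, not the client's actual export:

```python
def sessions_per_page(organic_entry_sessions: float, total_pages: int) -> float:
    """Average organic entry sessions per page over the reporting window."""
    if total_pages <= 0:
        raise ValueError("page count must be positive")
    return organic_entry_sessions / total_pages

# Hypothetical: 3,570 organic entry sessions across 3,500 pages in six months.
print(sessions_per_page(3570, 3500))  # ~1.02 sessions per page per six months
```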

Getting down to business

To further dissuade my content-loving client from churning out yet more legal prose, I pushed the analysis into something more concrete: the number of months it would take for him to generate a new inquiry by relying on more and more content. Given that we’ve found roughly 4 percent of organic sessions generate a business inquiry for law firms, and that the client was barfing out 25 pages of content a month, we determined it would take roughly 17 months for this tactic to pay off with an incremental inquiry.
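That payback estimate can be modeled as a simple accumulation: each month adds pages, and every page published so far accrues sessions at some flat rate. The function below is a sketch of that logic; the inputs in the example are illustrative, not the figures from the client above:

```python
def months_to_one_inquiry(pages_per_month: float,
                          sessions_per_page_per_month: float,
                          inquiry_rate: float) -> int:
    """Months of publishing until cumulative new-content sessions are
    expected to yield one business inquiry."""
    if min(pages_per_month, sessions_per_page_per_month, inquiry_rate) <= 0:
        raise ValueError("all inputs must be positive")
    sessions_needed = 1.0 / inquiry_rate  # e.g., 1 / 0.04 = 25 sessions
    month, pages, total_sessions = 0, 0.0, 0.0
    while total_sessions < sessions_needed:
        month += 1
        pages += pages_per_month          # new pages published this month
        total_sessions += pages * sessions_per_page_per_month
    return month

# Illustrative inputs: 10 pages/month, 0.5 sessions/page/month, 10% inquiry rate.
print(months_to_one_inquiry(10, 0.5, 0.1))  # 2
```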

With those numbers in mind, the “more content” tactic starts to look truly ridiculous — and you can see from the other data points that most websites within the legal industry focus far too heavily on content.

It’s not a far stretch to take this analysis to a more business-centric mindset and actually generate an expected cost per client, or even ROI based on generating more (and more and more and more) content.

Local search update: ‘Best’ filter invading the local pack
Tue, 25 Apr 2017

We all know good ratings and reviews are important for local businesses, and it seems they're becoming even more so. Contributor Conrad Saam explains a new search results display that appears when queries include words like 'best' and 'great.'


Google seems to be making moves to use the quality of small businesses — as determined by user ratings and reviews — as a ranking factor.

Today, the search giant has begun incorporating the quality element directly into the snack pack’s interface for some local searches (Hat tip: Dave DiGregorio). Note below on the search for “best Atlanta personal injury Lawyer”:

The filter appears consistently for queries containing the word “best” and other such superlatives. “Awesome,” “outstanding” and “great” all trigger the filter, although “stupendous,” “kick-a**” and “supreme” didn’t. Note that even without a qualitative element in the query, the filter still shows, though it isn’t prefiltered to businesses rated 4.0 stars and higher, and it appears in gray rather than in red.

I’ve suspected that Google has been looking to increase the focus on the qualitative element of local search results based on things that have been showing up, and have been talked about, in the industry where I work — the legal vertical. For example, we’ve seen SERPs with organic results from legal directory and lawyer rating service Avvo sandwiched between the ads and the local results on occasion. That site is specifically optimized for the word “best.”

This focus on quality is now showing up directly in the snack pack — making positive reviews an increasingly important component of the small business marketing arsenal. It’s not clear what weight the reviews take in the rankings. In the first example above, the third result actually has a higher rating than the second one. This could be because the second result has a higher volume of reviews — more than twice the number of reviews — or this could be due to physical proximity to the searcher or some other factor.

More content, less traffic: part I
Wed, 07 Dec 2016

It's important to have high-quality content on your website, but columnist Conrad Saam believes that SEOs might be overdoing it.


“Content is king” is the familiar SEO refrain that has spawned umpteen pages of thin, vapid website content. The push toward more and more content was mitigated somewhat by the next refrain, “quality content,” following Google’s numerous Panda updates.

But a widespread misconception perpetuated by the SEO industry persists: you need to continue to feed the content beast — otherwise, The Google won’t like your site.

So many marketers continue to push towards more sites, more pages, more content, more, more, more, more, more. The perceived need for more content is a convenient straw man excuse for agencies running a failed SEO campaign:

“Hey client, it’s not our fault your site isn’t performing — you just need to blog more. Still not working? Try more blogging!”

The focus on content is grossly overblown. Content is very rarely the answer — especially in industries where many sites have too much of it already.

I’m not sure there are any verticals where the push for more content is more pronounced than the hyper-competitive legal market. An entire cottage industry has sprung up, rewriting variants of “car accident attorney,” “Top 10 Things to Do After you are Rear Ended in Cleveland,” and “how to select a personal injury lawyer in Topeka,” in an attempt to “win” the SEO war for clients.

Law firms now have small teams of in-house content developers dribbling out prose that rarely sees an inbound visitor. This, of course, is exacerbated by poor installations of WordPress, which frequently generate multiple pages of identical content through the overly aggressive implementation of tags, categories and author pages.

I ran into the value of content minimization and curation almost a decade ago while working on Avvo’s Q&A section, in which lawyers respond to user-submitted questions — and each question generated a new page of content on the Avvo site. The most common example was some variant of the following: “I’m 19 and my girlfriend is 17… can we legally have sex?”

We had thousands, if not tens of thousands, of pages generated by variations of this question. The vast majority of those pages didn’t receive any inbound traffic and served only to bloat the site’s page count. Through a careful content curation process, we were able to consolidate this content into a series of high-quality pages, and we saw an overall lift in inbound traffic to this type of content.

Now, that was years ago. Today, I work directly with law firms — and given the push toward more and more content, I find myself again dealing with sites that are bloated with repetitive, duplicate and otherwise underperforming content. What follows are three case studies for law firms in which we’ve axed a bunch of site content with a resulting persistent increase in inbound search traffic.

Site I: Pages down 63%; traffic up 61%

Between July 8 and October 11, 2016, we went through a process of consolidating and removing pages from this site. Some of the content was genuinely useful to a user but should never serve as a destination for search traffic. A good example: pages with laws and statutes copied directly from state government sites — 50 pages in all, one for each state’s laws. These pages help users, but they should never be indexed.

Other pages were simply thin variants of content themes that needed to be consolidated into a single, specific page. In the graphs below, you can see an initial drop in reported index count in mid-August, followed by a dramatic drop in October, which is followed by an immediate 61 percent increase in search traffic.

[Graph: Site I indexed page count and organic search traffic]

Site II: Pages down 32%; traffic up 36%

In the second example, we cut content on three different dates. This is a WordPress site that vastly overused tags and categories, resulting in tons of duplicate content. They had also generated a large volume of extremely thin Q&A-style pages in response to the Hummingbird update.

On June 10, we dropped 10 pages of exact duplicate content. Following that, on June 21, we no-indexed 147 pages generated from media, tags and categories. Finally, on September 27, we consolidated 65 pages with very thin Q&A-style content. As you can see in the graphs below, the slow, steady decrease in indexed pages was mirrored by a slow, steady increase in inbound organic traffic.

[Graph: Site II indexed page count and organic search traffic]

Site III: Pages down 13%, traffic up 37%

The third example is a site for which we have had trouble generating significant improvement. But on October 5, 2016, we no-indexed 149 pages generated by WordPress tags and removed them from the XML sitemap. Note that many of these pages weren’t indexed anyway, although, based on the graph below, at least some were.

We did the same for the site’s 10 different author pages. You can see the immediate impact in the persistent traffic bump below. This is a stark example of how a well-intentioned but aggressive implementation of WordPress’s auto-generating pages can reduce inbound traffic.

[Graph: Site III organic search traffic]

More content, more problems

Of course, none of this should be surprising; it’s not like Panda is a new concept. But we keep hearing the chorus of “content is king,” which encourages site owners (small businesses especially) to keep posting. What we’ve found is that, in many cases, less content means more traffic.

In my follow-up post, I provide an analytical framework to answer the question: Should I continue to invest in more content, or is my marketing dollar more wisely invested elsewhere (like my kid’s college fund)?

The immediate results of link building
Fri, 16 Sep 2016

While conventional wisdom and recently published studies may hold that link building takes a long time to have a positive impact, columnist Conrad Saam begs to differ and shares four case studies.

In an April 2016 article on Moz, “How Long Does It Take for Link Building to Impact Rankings?,” author Kristina Kledzik posits that new links take about 10 weeks to have an impact. Our findings show the exact opposite: high-end link building has an immediate and persistent positive impact. I’ll explain further and share our data here.

Kledzik’s study was on a moderately large site with roughly 200K pages and focused on moderately difficult keyword rankings. She correctly points out that with a multiplicity of variables, it is extremely difficult to pinpoint the impact of any given ranking factor (such as a link).

Our study was conducted entirely differently and, I think, gives a better picture of why, in our experience, the impact of links is immediate and persistent.

Background on the study

First, it is important to know that we work exclusively with the legal industry — which means we’re bringing high-end link building to small business websites — so the impact should be more extreme. Second, to try to isolate the impact of links, I’ve cherry-picked (just) four case studies for link-building projects over the past four years that fit all of the following criteria:

  • extremely successful in generating high-quality links;
  • for non-retainer clients (i.e., with no other SEO work being done); and
  • concentrated in a very short time frame, with the actual link-building component constrained to just a few days.

Fundamental differences in the studies:

  • Lower authority. In general, these small business sites have lower authority — therefore, we’d expect the impact of any changes to be more dramatic.
  • A massive influx of links. For these clients, we didn’t create just one or two new links, but rather made a concerted campaign to drive a large volume of high-quality, relevant links in a short period of time, sometimes increasing the volume of quality links by 50 percent to 300 percent in less than two weeks.
  • Focus on traffic instead of rankings. We use traffic as our metric of success, not rankings, because traffic captures long-tail queries and site-wide benefits to individual links and is more relevant as it translates into actual business. This is more pronounced with local businesses like law firms, as local results are heavily influenced by links; therefore, overall site traffic should grow faster with overall domain authority improvements. This is especially the case in hyper-competitive localized markets like legal, where all best-practice fundamentals are in place and a site’s authority profile is a tie-breaker when it comes to visibility.
  • Fewer variables. Because we are dealing with small business websites, there are fewer additional variables that could explain traffic fluctuations. For these sites, nothing else is happening beyond these link-building campaigns.

Methodology note: Some of these link-building campaigns were undertaken in response to current events. In the graphs below, I’ve filtered out SEO traffic going to that specific content because I wanted to note the difference in traffic to the rest of the content on the site.

Case I: social justice

In this situation, we ran a link-building program for a law firm involving a specific women’s rights issue. All links were generated within a three-day time period. Inbound traffic jumped immediately by roughly 20 percent upon launch of the campaign.

[Graph: Case I organic search traffic]

Case II: class action

We represented a law firm filing a class-action suit against a restaurant chain, using social media to reach prospective litigants while concurrently generating links about the suit. In the graph below, you can see a 17 percent increase in inbound search traffic (bottom graph) immediately after the massive influx of traffic from the paid social media campaign (top graph).

[Graphs: Case II paid social traffic (top) and organic search traffic (bottom)]

Case III: famous injury lawsuit

This was a one-off link-building engagement, in which our client filed an injury suit against a very well-known technology company. The resulting coverage in large nationwide news sites spilled over into secondary news sites, with 114 domains linking overall.

You can see the bump in search traffic (driven by brand name search), followed by an immediate and persistent 46-percent increase in search traffic across the domain. Given the extent of the new links, I frankly expected a more marked improvement.

Upon further investigation, we found the site’s technical platform had numerous issues. We were essentially dropping a huge volume of links on a broken site, and the resulting increase in traffic was underwhelming, given the heft of the new links.

[Graph: Case III organic search traffic]

Case IV: Turkey Day blowout

I saved this one for last because (a) it was the most successful, and (b) there is a delay between the link-building efforts and the traffic increase (countering my claims about the immediacy of link-building impact).

This was an extremely clever content marketing project that was deliberately set to launch at Thanksgiving. And if you are intimate with traffic patterns for legal, you’ll know that website traffic for law firms craters during the Thanksgiving/Christmas/New Year period. However, once those holidays passed, the firm saw a persistent 65-percent increase in inbound search traffic.

[Graph: Case IV organic search traffic]

So, while it’s only four data points, in each case where we’ve dramatically improved a site’s authority profile through high-quality links, we’ve seen an immediate and persistent improvement in search traffic performance.

Mobilegeddon A Month Later: Small Business Study Shows… Nothing Happened
Wed, 20 May 2015

Contributor Conrad Saam shares the results of a small-scale study looking at the impact of Google's mobile friendly update on local law firms.


We’re now a month out from the launch of Google’s purportedly “apocalyptic” mobile friendly update on April 21, 2015. The result? Across the industry, many are coming to the conclusion that the hype surrounding Mobilegeddon was overblown.

This post-Mobilegeddon yawn is echoed within the local results as well. We did a concentrated study of small to mid-sized legal firms, and after carefully sifting through data — 69 law firm websites, tens of thousands of sessions, 16 days, and even a two-tailed statistical significance model — we’ve come to the very painful conclusion that:

  1. Little, if anything, happened to these small businesses.
  2. This algorithm update was less interesting than a replay of the Mayweather-Pacquiao bout or the release of another Greatest Hits Album from Queen.

With the noted exception of all of those law firms who collectively spent a small fortune making their websites mobile friendly, it seems that the mobile friendly update was a big bellyflop in the local small business market, at least where this type of business is concerned. (Mockingbird staff worked nights and weekends in April and May pulling some legal luddites into the mobile age. Given the lack of results, we now suspect that Mobilegeddon is a hoax cooked up by a bunch of out-of-work WordPress developers!)

Details Of The Study

We analyzed natural search traffic from mobile devices across 69 different law firms — 12 not mobile optimized and the remainder mobile friendly.

We didn’t include the first week of results post-launch (which also showed that nothing happened), both because the algorithm wasn’t fully rolled out at that point and to account for the erratic traffic fluctuations that frequently occur when Google tests its algorithm changes in real time. We pulled data for week two and compared it to a benchmark of average weekly mobile search traffic across eight weeks of pre-April 21 data.

First Attempt To Demonstrate The Mobile Change Actually Exists

We first ran a test comparing the changes in traffic before and after the 21st across the two groups. Note that we have a reasonably small sample size — and some of these small business sites get very little traffic — so small variations can yield a large percentage change. In statistical terms, this means a large standard deviation. Not exactly an ideal data set.

We dusted off our stats text and ran a t-test to assess whether there was a statistically significant difference in performance between the two groups. How this works: essentially, the test estimates the expected sampling distribution from the two samples and then determines whether the difference between the group averages is larger than chance alone would explain.

The result? By conventional criteria, this difference between the two data sets would not be considered statistically significant. (If you want to geek out with the stats, the two-tailed P value was 0.7889.)
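For anyone who wants to reproduce this kind of check without a stats package, Welch's two-sample t statistic (which does not assume equal variances) can be computed directly. The sample data below is made up, and the p-value would still come from a t table or a library such as scipy.stats:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent
    samples with possibly unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample (n-1) variances
    se2 = va / na + vb / nb                          # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Made-up percentage traffic changes: mobile-friendly vs. non-mobile-friendly.
t, df = welch_t([0.05, -0.02, 0.10, 0.01], [0.03, -0.01, 0.08, 0.02])
```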

Second Attempt To Demonstrate The Mobile Change Actually Exists

Every statistician knows that if you data-mine hard enough, eventually you can find something that supports your hypothesis. (This is why I caution against marketers evaluating the efficacy of their own marketing, but I digress.) So, back to Google Analytics we went…

Perhaps if we combined all of the traffic in each group and then compared the aggregate traffic change between mobile friendly and non-mobile friendly sites, we might see that the engineers at Google have actually been doing more than playing Segway Quidditch.

But, alas — not only was there just a 2% difference in performance between the groups, the non-mobile friendly group actually outperformed the mobile friendly group! And, if you go back to the original test above, we were just looking for a statistical difference between the two samples (not that one was higher than the other).

We have a PPC client with a non-mobile friendly site. I spent a significant amount of time in February and March aggressively pushing him to upgrade his WordPress site to a mobile friendly theme. I hate selling aggressively, but I was pretty sure this update was going to be devastating to his business. David, if you are reading this, there’s a bottle of scotch in the mail to you.

Pigeon Rolled Back? Law Firm Study Says Yes
Thu, 09 Oct 2014

Local search marketers have been concerned about the impact of Google's Pigeon update on small businesses — has the search giant taken notice?


Pigeon — that hastily rolled-out Google algo change that impacted both local and natural results — had those of us working in or for small, localized businesses (like law firms) in an utter panic. Early consensus among the local search geeks was that Pigeon:

  • heavily favored the massive directories — in fact, there was a lot of discussion about how Pigeon may have been an overreaction to Yelp’s persistent anti-competitive whining;
  • drastically reduced the frequency of local packs; and
  • reopened local results to previously diagnosed local spam tactics.

Many of us, myself included, were, not surprisingly, vocally critical of the mess that Pigeon left all over the SERPs. Local search rock star David Mihm delayed his annual Local Search Factors study to give us a chance to evaluate just what Pigeon had dropped.

Frankly, from my perspective, it didn’t make sense that small businesses were being marginalized in favor of mega-directories. Yet, despite all of the talk from Google about the little guys, this is exactly what was happening.

What Really Happened

I recently sat down with Gyi Tsakalakis from AttorneySync to compare notes and to evaluate the impact of Pigeon on a small subset of the heavily local, small business market: law firms.

We reviewed natural search traffic from 57 different law firm sites and compared average weekly traffic for six weeks (post-Pigeon and pre-Panda 4.1) to a benchmark of average weekly traffic for the eight weeks pre-Pigeon.

In the analysis, we frankly ignored the two weeks of data immediately following Pigeon, under the (correct) assumption that results during that period were too zany, unpredictable and fleeting to be meaningful.
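The benchmark comparison itself is straightforward to sketch; the weekly session counts below are hypothetical:

```python
def relative_traffic_change(pre_weeks, post_weeks):
    """Percent change in average weekly organic sessions,
    post-update vs. a pre-update benchmark."""
    pre_avg = sum(pre_weeks) / len(pre_weeks)
    post_avg = sum(post_weeks) / len(post_weeks)
    return (post_avg - pre_avg) / pre_avg * 100

# Hypothetical site: 8 pre-Pigeon weeks vs. 6 post-Pigeon weeks.
print(relative_traffic_change([100] * 8, [120] * 6))  # 20.0
```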

[Graph: Pigeon analysis]

Relative changes in site traffic for 57 websites before and after the Pigeon update.

With a few exceptions, the results are spectacularly uninteresting (which in and of itself is an interesting pattern). As shown in the graph above, more than half of the sites experienced a net growth in traffic — not what I would have expected given the initial expert histrionics around Pigeon.

More importantly, about two out of every three sites saw a traffic change of less than 20% — about par for the course for these smaller sites.

We reviewed the sites at the extreme ends to try to identify any patterns. I expected to see a concentration of NAP fakers (law firms are notorious for trying to look bigger by claiming fake locations) getting hit the hardest. Not so.

One of the firms that experienced the biggest traffic growth is, in fact, an artless and flagrant geo-spammer. It turns out there was no common thread (that we could identify) among the biggest winners and losers.

Interestingly, when I reran the numbers, looking at traffic for the four weeks immediately following Pigeon launch, two out of three of the law firm sites saw a drop in traffic.

Reading the tea leaves, it looks to me like, after a very unwelcome arrival, Pigeon is slowly being rolled back. Anecdotally (at least in legal), we’re seeing local packs coming back, with 7-packs replacing the 3-packs. Here’s a current example Gyi sent me this weekend (with the subject line “Pigeon Poop Begone”):

[Screenshot: Google search results for “los angeles personal injury attorney”]

7-packs seem to be returning to some local results.

Now, much of this is just conjecture. It is also quite likely that law firms run into more spammy NAP issues than your average small business, so the results may not be representative overall. Suffice it to say, I think it would be a shame for small businesses to take the brunt of an algo change, and the limited data here suggest that Google has turned around to reflect this.

Suing Your SEO: Can An Agency Be Held Liable For Poor Results?
Fri, 02 Aug 2013


It was only a matter of time before a lawsuit was filed against a search engine optimization agency for failing to deliver.

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO (read: mobsters and racketeering — more on that later).

To the best of my knowledge, this is the largest and most public legal imbroglio involving aggressive performance claims, angry clients, SEO agencies and black hat tactics.

To date, clients caught up in agency black hat shenanigans (JC Penney, anyone?) have swept the news under the rug as expeditiously as possible.

Internal marketing departments, and their close cousins in PR, are only too eager to avoid public discussion of a lawsuit that would paint them as at best entirely ignorant of, and at worst entirely complicit in, short-term black hat search practices.

Full disclosure: I know Stephen Fairley at The Rainmaker Institute and have spoken about SEO at many of his events in the past (gulp). Everyone outside of legal must understand that the industry has gone to great (and excessively overreaching) lengths in attempts to legislate away the ambulance-chasing lawyer stereotype.

Are SEOs Responsible For Outcomes?

When it comes to marketing, attorneys generally have their hands extremely tied due to restrictive regulations on legal advertising, though restrictions do vary from state to state. The introduction of online marketing into the mix has further complicated things — existing regulations do not always have clear applications for online marketing, and regulators are still in the process of putting together advertising guidelines for this new frontier.

The Rainmaker Institute has gone to great lengths to not only educate, but to encourage marketing of legal services. I stand behind the notion that marketing is good for the industry as it brings attorneys closer to people who really need help.

I commend Fairley for being instrumental in making this happen. But, did he go too far? And is this a precedent where SEO agencies are on the hook for tactics and performance in search? And, what about this racketeering thing?

RICO Racketeering?

While I don’t want to go too deeply into the legality of things, RICO (short for the Racketeer Influenced and Corrupt Organizations Act) was created in 1970 to help close loopholes that protected the leaders of organized crime from liability for the actions of their group — like a mob boss ordering a hit, for example.

Seems a far cry from link building, so I asked some lawyers: Dan Kalish from the employment firm HKM explained that Rainmaker’s actions would need to be seen as a “broad, interstate scheme, involving several victims and spanning several years.” Seth Price described the racketeering claim as “a creative attempt to get a simple contract dispute from state arbitration into federal court.” (For a full legal counterpoint, try this piece by Clay Hasbrook.)

Let’s dig into the complaint itself:

The Rainmaker Institute disclaimed all liability for lack of success of its efforts . . . The action is based on the fact that, at the time that the defendants were promoting this marketing scheme to the Victim Firms, they knew that the techniques they proposed to use were in violation of the guidelines already well-established and published by Google.

This raises the question for all agencies: does knowingly violating Google’s guidelines open agencies up to lawsuits? Let’s be honest, if that were the case, more than a few of my search friends would have found themselves in court already. Even really bad tactics work (for some period of time). And of course, Google’s guidelines change, and tactics become outdated.

Marketing Is A Crapshoot

Remember when boldfaced text was a genuine indicator of what a page was about? Or when (genuine) blog comments were a good authority signal? Or, more recently, over-optimized anchor text links? Marketing is still a crapshoot — TV, billboards, Super Bowl ads, skywriting and SEO. Holding any advertising agency responsible for what does and doesn’t work is sour grapes from a bitter client.

Everyone involved in marketing channels on the client side knows that their primary function is to assess not only the marketing channel in question, but also the projected outcomes of that channel.

My two experiments in TV advertising have been utter failures, but suing Comcast would be an asinine response. Yet, that is what S&S is doing. This is the heart of the problem for the law firm in this case — essentially, they are saying, “We bought something that was widely recognized by everyone (but us) to be ineffective.” And like the JC Penney marketing department, S&S was either extraordinarily ignorant or has some serious (but informed) buyer’s remorse.

The scorn among the legal blogosphere has been evenly meted out between plaintiff and defendant. (From my experience, the only thing blogging lawyers hold in higher contempt than meritless lawsuits are SEO consultants.) From the acrid eloquence of Scott Greenfield:

… And were Seikaly & Stewart victimized by Fairley’s unkept promises? It’s beyond ironic that a firm seeking to buy its way to prominence from a marketeer complains that it was out-deceived. It’s not that they have no cause of action, having paid a pretty sweet sum to the Rainmaker Institute and gotten bupkis in return, but that when someone seeks to game the system and got played in return, it’s just awfully hard to feel badly about the whole thing . . . Plus, it’s always fun to see the imaginative uses to which civil RICO is put.

The final obvious result, of course, is that no search agency in their right mind will ever want Seikaly and Stewart on their client roster. However, the SEO cynic in me hopes there is another angle: this could be nothing more than an extremely clever, sophisticated, premeditated (and genuine) link building exercise orchestrated by a genius law firm marketing intern.

How To Work With Four Common SEO Leadership Styles
Mon, 11 Mar 2013

I’ve had the pleasure of working on SEO in a variety of companies both in house and, frequently, through casual advice to other companies. After eight years in the business, I’m convinced that the success of search for an in-house person hinges on their ability to work within a leader’s approach to search. This is due to the interdepartmental cooperation needed to effectively run search as well as the fluid nature of our business.

What follows are four extreme leadership caricatures, vis-à-vis search, and recommendations on how to work search into an organization most effectively under each leadership style.

Hands Off Leaders

The hands-off leader is usually old school and has been successful either prior to the advent of the Interweb or through other online channels such as highly measurable PPC, email or display. Mr. Hands Off probably still has an AOL email address for personal use.

In these organizations, search probably doesn’t exist as a function, and if it does, it’s probably grouped under an entry level PPC or display person. The Hands Off leader has either never decided to push search as a channel or worse (and more likely), has proactively chosen to steer clear of it.

How To Manage Hands Off

This is actually a great opportunity for a search marketer — the Hands Off leader has probably built a successful business ripe with low-hanging search fruit. Your challenge here is to get some early, easy wins and report metrics up as widely and as high as possible.

Use business metrics such as ROI, cost of sales, cost of acquisition etc., to compare search against other channels. Search will (almost) always outperform.
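That channel comparison is just arithmetic. Here’s a minimal Python sketch; the channel names, spend and conversion figures are all invented for illustration:

```python
# Compare channels on cost per acquisition (CPA = spend / conversions).
# Every number below is hypothetical.
channels = {
    "organic search": {"spend": 4000, "conversions": 80},
    "display":        {"spend": 6000, "conversions": 30},
    "email":          {"spend": 1500, "conversions": 25},
}

cpa = {name: c["spend"] / c["conversions"] for name, c in channels.items()}

# Rank channels from cheapest to most expensive acquisition cost.
for name, cost in sorted(cpa.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:.2f} per acquisition")
```

The same comparison works with cost of sales or ROI; the point is to put search on the same yardstick the rest of the business already uses.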

The biggest challenge in a Hands Off organization is the cross-departmental cooperation that must happen for most search efforts to be effective.

Conquer this by over communicating and always awarding the success of those aforementioned business metrics to other departments. It’s amazing how you can make friends by turning cost centers into profit centers.

Confidently Ignorant

The Confidently Ignorant leader is a garbled mess of buzzwords and directives. This leader is usually someone who has worked at, but not on an online property and wouldn’t know the difference between a canonical tag and a magic wand.

While she speaks confidently of GoogleJuice, the nuance between linkbait and spam is a mystery to Confidently Ignorant. You’ve never heard her say, “I don’t know,” but she frequently spouts off on dated information she picked up at a networking event that was trendy months or years ago. “We need more Pinterest Juice!”

Usually, nothing substantial ever really happens under Confidently Ignorant.

How To Manage Confidently Ignorant

The good news is that Confidently Ignorant knows that search is important — just not what to do about it. The primary challenge here is to focus on those things that do move the needle and minimize the garbage. Use Confidently Ignorant’s bluster to gather resources across departments to push through your search agenda.

Managing in this organization requires a disciplined approach to project management, i.e., identify business objectives before a project is undertaken and then regularly go back and post-mortem every project. This provides you with a structure to demonstrate (privately) that the pet project Confidently Ignorant crammed through wasn’t a great investment.

In fact, over time, these post-mortems may serve to simultaneously educate and bore her with the technical details of search. Victory. Unlike every other situation, ascribe search success directly to Confidently Ignorant (instead of the departments who actually did the work) to garner further support for projects you want to undertake.

Aspiring Growth

The Aspiring Growth leader often runs a start-up, is confident that his product is desperately needed and is certain that the Interweb will deliver a flood of business; he’s just not sure how. Aspiring Growth has picked up a few things here or there, but knows what he doesn’t know. He is admittedly (and appropriately) focused on his area of expertise (usually building out a product that people don’t even know they need yet).

Additionally, he has limited understanding of the time and competitive factors around search – i.e., it’s going to take a lot more time and resources to get traffic for “mesothelioma lawyer” than “pink fuzzy bunny slippers.” He probably thinks of search in terms of ranking reports.

How To Manage Aspiring Growth

Aspiring Growth needs a lesson in reality. I’ve found that the best way to work with Aspiring Growth leaders is to overwhelm them with education around the technical and tactical components of search, communicating early on and often. This does two things: 1) it grounds them in the reality that search isn’t an immediate payoff, and 2) it bores them with the details. This gives them the confidence to let you do your job while they return to what they do best.

Aspiring Growth is highly susceptible to unscrupulous or just really bad search and Web development agencies, and you may find yourself spending most of your effort unraveling legacy issues that were offshored.

SEO Maven

SEO Maven leaders are few and far between. She probably has a high-end technical degree and sold her first business, which she coded entirely on her own.

The SEO Maven has inculcated search best practices across the organization, does nothing at all to the site without A/B/C/D/E . . . testing, and has replaced Google Analytics with some impressive in-house reporting tools because she’s paranoid about sharing site data. She could probably sell these tools to the agency world, but would never consider it, as she’s too focused on using them to build her business.

How To Manage the SEO Maven

This is both a hard and easy place to be as an in-house search marketer. First, there is no way you are going to be the search subject-matter expert, so eat a humility pill every day at breakfast. However, working for the SEO Maven minimizes the political battles and interdepartmental struggles that so frequently occur otherwise when dealing with search.

In SEO Maven organizations, “what have you done for me lately” is the overriding perspective. Test, test, test. Be creative and then test again. These businesses are the ultimate training grounds for search marketers — if someone getting out of college wanted to get into the business, this is where I’d recommend they go (especially instead of the agency route).

Due to the connected nature of our business, highly political corporate cultures tend to flounder in their search efforts, while focused, nimble companies often succeed. While we may bemoan this reality, as search marketers, learning to function within the leadership and political structure of our organization is a necessary skill.

Dealing With People – The Hardest Part Of SEO
Wed, 10 Oct 2012

The really hard part about search engine optimization isn’t the SEO itself but dealing with people within your organization who have (or should have) an impact on SEO. Optimizing H1s is easy. Dealing with people is hard. Really hard.

What follows are a series of the most important lessons I’ve (maybe) learned while dealing with MBAs, devs, product managers, designers and my own ego.

Don’t Think Everything Through

We’ve been trained through years of education, higher education (and for some people, worse: consulting gigs) to deliver beautifully polished, thoroughly thought-through project plans. We wrap these plans in detailed reports and color-coded timelines and then present them to key stakeholders. Nothing could be less effective.

These reports are inevitably met with questions you haven’t thoroughly thought through (or even considered), staffing limitations, technical impossibilities, competing priorities and political agendas. Avoid this by involving stakeholders as early as possible and having hard, messy conversations early in the process instead of at the end.

Don’t Design Anything

It’s very easy to have opinions on design. Remember what they say about opinions and leave your designers alone to be the professionals that they are.

Don’t Trust Salespeople Who Describe Easy Implementation

I’ve been doing this for 15 years, and I’ve never seen an easy, seamless implementation. Get your technical people talking to their technical people before you sign the contract. Good salespeople will push this for you. Bad salespeople will send you a contract first.

Don’t Assume Noobs Understand Anything

I once watched a new hire struggle for weeks until we had a coffee conversation about search and it became clear he didn’t know the difference between a title tag and an H1 and thought a sitemap was something optometrists used.

Bad hiring process for sure – but you can minimize this problem by pushing new hires through a search orientation, in which you lay out search fundamentals as well as your company’s overall search philosophy.

Don’t Think About Traffic

Ahhh, the monthly UU count — that standardized pissing-contest yardstick by which all sites are compared. The UU count is a holdover from the ad-supported model. It (frequently) means nothing to your company’s business health.

Unless you are pushing Sealy Posturepedic display campaigns or have no ethical qualms about targeting poor college students with Chase Bank credit card ads, UUs are probably not the right yardstick for your business.

A retailer, for example, should much rather increase converting traffic by 10 percent than double non-converting traffic. This has huge implications for your search strategy.

Don’t Hire People With A Blackhat Background

This is an unfair, broad-brush-stroke generalization, but people who have worked in a “we’re smarter than those massively capitalized search engines” mode have trouble leaving this perspective behind. This is true for both in-house and, even worse, consultant SEOs.

Yes, some (in fact, many) of the search celebrities who once made tons of money pushing Viagra from Canada have now been reborn as virginal white hats, but I fear an arrogance that just can’t be left behind.

Don’t Let People Who Did A Project Evaluate It

We have access to more data than we know what to do with. This means that with some good data mining, it’s rare that you can’t shape an analysis in whatever light you want to in order to impress your genius upon a boss.

Avoid this problem by clearly calling out success metrics, data sources and the evaluation time period at the outset of the project. Better yet, have all analysis performed by a dedicated, disinterested number cruncher. Speaking of which . . .

Don’t Hire Optimistic Analysts

Optimists make good cheerleaders and visionary CEOs — they make really poor analysts and financial prognosticators. We recently hired a full-time analyst who is downright cranky — suspicious of all assumptions, critical of growth multiples, and highly skeptical of any hockey-stick graph that would put a smile on a VC’s face. Best hire ever!

Don’t Follow Ranking Reports

I’m still seeing business leaders obsess over ranking reports and search “tools” that sate the desire to compare rankings for specific terms, despite personalization, social, constantly changing SERPs and the fact that this term-based focus can lead you down a very dangerous path.

One cranky day, I wrote a diatribe against ranking reports: Excuse me While I Have a Ranking Report Rant.

Don’t Use Consultants

I’ve made many unfriends (and been uninvited to a few conferences) for pushing businesses away from search agencies.

Ignoring the proliferation of hacks masquerading as search gurus, here’s why search should be an in-house function:

  • Given the potential downside, I’d only hire a consultant if I knew exactly what they were doing, and if I knew exactly what they were doing, I’d do it myself.
  • SEO touches so many parts of the organization – this is very hard to deal with as an in-house, and it’s almost impossible for an offsite third party.
  • SEO changes constantly, so it is vital to have search fundamentals baked into your ongoing marketing, content, and development cycles.
  • SEO done well is a constant process of testing and retesting and incorporating the learnings from these tests into the mental memory of the organization. This just can’t be accomplished from the outside.

Good reasons to use consultants:

  • If you really don’t know what you are doing and you want to bring someone in to train your staff (or to keep your staff up to speed on the cutting edge issues: “how should we think about the next iteration of Penguin?,” for example).
  • If you are doing a highly technical, infrequent change – like a complete change in back end infrastructure.
  • Getting a third party audit coupled with in-house training may serve to uncover some things your in-house group hasn’t thought of.
  • If you need someone to throw under the blackhat bus – some companies want an agency on record to take the fall when they are inevitably penalized. Remember JCPenney, anyone? SearchDex, the agency they threw under the bus, quickly pulled down their page listing their clients and now, almost two years later, still won’t ID a single client on its website.

Don’t Benchmark Competitors

Trying to emulate competitors will push you to be just as bad as they are. Instead, benchmark best practices (or creative approaches) outside your industry. I’ve written more about this here: Aspirations of Incompetence: Benchmarking Competitors.

Don’t Take Credit For Any SEO Successes

Bill Gurley of Benchmark Capital described search traffic to me as “free beer.” As the in-house SEO, you are in a unique position to apportion this keg of free beer across the organization. Nothing engenders more buy-in for the importance of search than public attribution for success.

So, as an in-house SEO, your golden rule is to never ever ever take any credit for anything good that ever happens with regards to search.

How Scattergraphs Can Be Your Best Friends
Mon, 13 Aug 2012

Recently, I was on an in-house SEO panel at SMX with REI’s Jonathon Colman. Most of the audience’s questions centered around explaining and reporting relevant metrics to upper management.

Turns out, while search has come a long way, many execs still use terms like “Google Juice” and define success as launching a PPC campaign to “rank number 1 for our competitor’s name”. This issue is even more pronounced in larger, established companies where search makes up a smaller portion of the marketing mix.

:::sigh:::

Jonathon’s primary recommendation centered around “data visualization” – explaining and reporting on search concepts (and progress) through pictures instead of technical jargon and theory.

To the extent that you can translate your SEO efforts into picture books for MBAs via PowerPoint, you can successfully focus those with limited search understanding on the correct tactics.

Enter Scattergraphs

What we are all really trying to do is develop a clear understanding of “if I do X, then Y is going to happen”.

In mathematical terms, this is called a correlation coefficient – i.e. the extent to which two series of datapoints are interrelated. Correlation coefficients range from +1 (perfect positive correlation) to -1 (perfect negative correlation).

This gets far more complex when you add more than two variables; the analysis then becomes a statistical methodology called multiple regression analysis, in which you try to determine the extent to which multiple data points impact a variable.

This is the process undertaken by some search consultancies and tool providers who try to use data to backdoor their way into search engine algorithms. Multiple regression analysis is a hairy process that involves words like heteroscedasticity and requires an advanced degree in statistics or econometrics to do with any degree of accuracy. I stay away.

One note of caution: correlation does not mean causation. Just because two datapoints follow a similar pattern doesn’t mean one influences the other. An obvious example of this is sunrise and eating breakfast . . . while these things often happen in sync, eating your Cheerios at 4 a.m. will not make the sun rise any earlier.

Simple regression, in which we are just looking at the fit between two data series, is, in fact, pretty easy stuff. The concept is straightforward — calculate the straight line that best fits the two series when plotted on a graph. If you want to geek out on the math behind this, try the Simple Linear Regression page on this awesome site I just found called Wikipedia.

Here’s a visual explanation of correlation coefficients and simple regression:

(Obviously, this is not my graphic – do you think I’d deliberately highlight a negative correlation between hair and time?)

If you’d actually like to do it instead of memorizing the Greek symbols behind math formulas . . . use good old Excel. Here’s how:

1.  Select Two Datapoints

While you can calculate correlation between all sorts of things, may I suggest starting with inbound natural search traffic and some variable that theoretically impacts it?

To get multiple datapoints, you’ll need to segment your data – in the case of Urbanspoon, it’s pretty easy – we can look at traffic by city, cuisine type, or entry categories (restaurant pages instead of city pages for example).

Now, normalize that data:  if you are looking at differences by geography, calculate penetration by dividing your entry sessions by population; if you are looking at differences by product category, calculate penetration by dividing by overall search impressions. (Depending on your data sources, this normalization process can be persnickety and tricky.)
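As a rough sketch of that geographic normalization step (city names, session counts and populations all invented):

```python
# Natural search entry sessions, and the population of each market.
sessions   = {"seattle": 42_000, "portland": 18_000, "spokane": 4_500}
population = {"seattle": 750_000, "portland": 650_000, "spokane": 230_000}

# Penetration = sessions per 1,000 residents, so markets of very
# different sizes can be compared on the same scale.
penetration = {
    city: 1000 * sessions[city] / population[city]
    for city in sessions
}

for city, p in sorted(penetration.items(), key=lambda kv: -kv[1]):
    print(f"{city}: {p:.1f} sessions per 1,000 residents")
```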

2.  Open Excel

Put your two datapoints into two Excel columns.

3.  Correlation Coefficient

Calculate the correlation coefficient between the two columns using the CORREL function. This will give you the mathematical correlation coefficient indicating the extent to which those two datapoints are correlated — the closer to 1, the tighter the positive correlation; the closer to -1, the tighter the negative correlation. Correlation coefficients close to zero indicate no correlation.

4.  Turn this Number Into a Picture

Use Excel to create a scattergraph of these two columns like the ones above. I like to put the natural search penetration on the vertical axis and the tactical variable on the horizontal axis. Assuming there is a correlation . . .

5.  Impact the Variable

Engage in whatever tactic you are analyzing by selecting a few of the datapoints that are underperforming (i.e., for positive correlation, these datapoints will sit in the bottom left quadrant of your scattergraph). This tactic could be link building or social mentions, for example. Your goal is to move the datapoint along the horizontal axis and see if it also moves up the vertical (penetration) axis.
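Picking those bottom-left-quadrant underperformers amounts to filtering datapoints that sit below the median on both axes — a sketch with invented segments and values:

```python
from statistics import median

# (tactic value, search penetration) per segment -- all numbers invented.
points = {
    "city_a": (12, 0.8),
    "city_b": (40, 2.1),
    "city_c": (8,  0.5),
    "city_d": (55, 3.0),
    "city_e": (20, 1.0),
}

mx = median(x for x, _ in points.values())
my = median(y for _, y in points.values())

# Bottom-left quadrant: low on the tactic AND low on penetration.
targets = [name for name, (x, y) in points.items() if x < mx and y < my]
print(targets)  # -> ['city_a', 'city_c']
```

These are the segments where you apply the tactic and then watch whether they migrate up and to the right.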

6.  Wait

How long you wait depends on what tactic you are using and how long (theoretically) you think it will take for the tactic to have an impact.

7.  Redraw the Scattergraph

Now, after you have a new set of data, redraw your scattergraph. Highlight those variables in a before-and-after comparison of the scattergraphs and demonstrate to your MBAs the extent to which movement along the horizontal axis is reflected in movement up the vertical axis. Highlight this movement with arrows or different colors for your test datapoints.

You can even redraw both data grabs using different colors on the same graph, or show a simple before and after.

8. Declare Success or Failure of Tactic

Based on the result, roll out your effort more broadly or abandon the tactic altogether.

This gives you a real way of calculating the impact of your tactics. If you have cost metrics (and you should), you can transcend discussion of GoogleJuice (yummy, I like mine on ice) and make ROI-driven investments in search.
