Jonah Stein, Author at Search Engine Land

Lessons From Google On Optimizing Your SEO (February 17, 2014)
Dan Cobley, Google UK’s Managing Director, recently revealed that Google’s infamous 2007 “50 Shades of Blue” experiment involving ad links in Gmail increased revenue by $200 million a year. Those results shifted the balance of power from design-driven decisions to data-driven engineering decisions, and famously led Google’s top designer, Doug Bowman, ultimately to resign in frustration.

Search professionals have since observed and reported hundreds of UI experiments, and the search engine results page has gone through dozens of iterations since the days of 10 blue links.

Universal Search, embedded local with the 3-pack, 5-pack and 7-pack variations, rich snippets, authorship, the knowledge graph, and of course the carousel have all become part of the SEM lexicon. You can be confident that all of these changes were A/B (or multivariate) tested and vetted against some conversion goal.

So what does this have to do with SEO?

How Does Google Measure A Conversion From Organic?

If every change to the presentation layer is driven by conversion optimization, it is reasonable to assume that organic rankings are also informed by the same approach. The dilemma is that we do not know what criteria Google is using to measure a “conversion” out of organic results.

The first concrete example of Google using user data to influence SERPs appeared in 2009, when Matt Cutts revealed that Google sitelinks are partially driven by user behavior. This tidbit surfaced during a site clinic review of Meijer.com at SMX West, when he noted that the Store Locator was buried in the site’s primary navigation but was nonetheless a popular page “because it appears in your site link.”

The Meijer Store Locator is the top choice in the sitelinks, despite the fact that it is still buried in the global navigation and the page is essentially devoid of content.

[Screenshot: Meijer sitelinks in the Google results]

A quick look at the AdWords Keyword Tool shows that “meijer locations” is only the 12th most popular query, well below “ad” (#2), “mperks” (#4), “pharmacy” (#6) and “weekly ad” (#8). Even if you were to group these keywords by landing page, it is difficult to imagine behavior in the SERPs that would elevate the Store Locator to the top of the sitelinks. The only thing maintaining its prominence appears to be user behavior after the query.

The Big Brand Bailout

The second confirmation of user behavior affecting ranking came with the Big Brand Bailout, first observed in February 2009: big brands started magically dominating search results for highly competitive short-tail queries. Displaced site owners (many with lead-gen sites) screamed in protest as sites with fewer backlinks suddenly vaulted over them. Google called this update Vince; @stuntdubl called it the Big Brand Bailout, and that is the name that stuck.

Hundreds of pundits suggested how and why this happened. Eventually, Matthew Trewhella, a Google engineer who was not yet trained in the @MattCutts School of Answering-Questions-Without-Saying-Anything-Meaningful, slipped up during an SEOGadget Q&A session and revealed that:

  1. Google is testing to find results that produce the fewest subsequent queries, from which we can conclude that a subsequent query is treated as a “conversion failure” when testing organic results. Matthew said the Vince update was about Google minimizing the number of times people have to search to find the products or information they are looking for. Every time a user has to perform a second search, Google regards it as their failure for not bringing up the right result the first time.

  2. Google is using data on users’ subsequent query behavior to improve SERPs for the initial query and to elevate sites for which users indicate intent later in the click stream. In other words, Google is using user behavior to disambiguate intent and influence rankings on the original query in an attempt to improve conversions (a rough illustration of such a metric follows this list).
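Google has never published anything like this metric, so what follows is only a sketch under stated assumptions: the log format, field names and five-minute window are my own invention, not anything Google has confirmed. It computes a crude “subsequent query rate” per clicked result — the share of clicks that were followed by another search in the same session shortly afterward.

```python
from collections import defaultdict

def subsequent_query_rate(log):
    """Toy metric: for each clicked result, what share of clicks were
    followed by another query in the same session within five minutes?

    `log` is a list of dicts such as
      {"session": "abc", "ts": 1000, "event": "click", "url": "example.com/a"}
      {"session": "abc", "ts": 1030, "event": "query", "terms": "cheap widgets"}
    The schema is purely illustrative.
    """
    WINDOW = 300  # seconds; a follow-up query inside this window counts as a failure
    stats = defaultdict(lambda: [0, 0])  # url -> [followed_by_query, total_clicks]

    sessions = defaultdict(list)
    for row in log:
        sessions[row["session"]].append(row)

    for events in sessions.values():
        events.sort(key=lambda r: r["ts"])
        for i, row in enumerate(events):
            if row["event"] != "click":
                continue
            followed = any(
                nxt["event"] == "query" and nxt["ts"] - row["ts"] <= WINDOW
                for nxt in events[i + 1:]
            )
            stats[row["url"]][0] += int(followed)
            stats[row["url"]][1] += 1

    return {url: failed / total for url, (failed, total) in stats.items()}
```

Under that framing, a result with a low subsequent-query rate “converted” the searcher; a high rate is exactly the failure Trewhella described.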

Learning From Panda

The third example of user engagement data affecting search results came with Panda. Many parts of the Panda update remain opaque, and the classifier has evolved significantly since its first release. Google characterized it as a machine learning algorithm and hence a black box which wouldn’t allow for manual intervention.

We later learned that some sites were subsequently added to the training set as quality sites, causing them to recover and be locked in as “good sites.” This makes it especially hard to compare winners and losers in order to reverse-engineer best practices. What most SEO practitioners agree upon is that user behavior and engagement play a large role in a site’s Panda score. If users quickly return to the search engine and click on the next result or refine their query, that can’t be a good signal for site quality.

What does Google’s conversion testing mean for SEO?

Putting These Learnings To Use

Google’s announcements are often aspirational and seemingly lack nuance. They tell us that they have solved a problem or devalued a tactic and SEOs quickly point to the exceptions before proclaiming the announcement as hype or FUD. Years later, we look around and that tactic is all but dead and the practitioners are toast (unless the company is big enough to earn Google immunity). These pronouncements feel false and misleading because they are made several iterations before the goal is accomplished. The key to understanding where Google is now is to look at what they told us they were doing a year ago.

What they told us 18 months ago is that the Search Quality Team has been renamed the Knowledge Team; they want to answer people’s search intent instead of always pushing users off to other (our) websites. Google proudly proclaims that they do over 500 algorithm updates per year and that they are constantly testing refinements, new layouts and features.

They also allude to the progress they are making with machine learning and their advancing ability to make connections based on the enormous amount of data they accumulate every day. Instead of the Knowledge Team, they should have renamed it the Measurement Team, because Google is measuring everything and mining that data to understand intent and provide users with the variation they are looking for.

What does this mean to site owners?

Matt Cutts told us at SMX Advanced in 2013 that only 15% of queries are of interest to any webmaster or SEO anywhere; 85% of what Google worries about, we pay no attention to. That also means an update that affects 1.5% of all queries can affect 10% of the queries some SEO somewhere cares about (1.5 is 10% of that 15% slice) and 50% of the top “money terms” on Google.

Simultaneously, Google tends to roll out changes and then iterate on them. The lack of screaming protests or volatile weather reports suggests that very few results actually changed when Hummingbird was released — at least, results you can view in a ranking scraper. Instead, Google rolled out the tools they need to make the next leap in personalization, which will gradually pick winners and losers.

A Third Set Of Signals

SEO has long focused on onsite and offsite ranking signals, but the time has come to recognize a third set of signals. Google conversion testing within the SERPs and user interaction signals are becoming more and more important to organic ranking. Let’s call this third set Audience Engagement Signals.

The good news is that this paradigm gives onsite changes a significant chance to improve performance and generate strong positive Audience Engagement Signals. Machine learning is data-driven, and Audience Engagement Signals, like clicks, shares, repeat visits and brand searches, are measurable user actions. Site owners who embrace user-focused optimization and align their testing goals with what we can reasonably infer to be Google’s conversion metrics (instead of our own narrowly defined conversion goals) are likely to improve audience engagement.
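As a rough illustration of what “measurable” can mean in practice, here is a minimal sketch that summarizes a few of these signals from an analytics export. The record schema, thresholds and brand terms are assumptions made for the example, not a standard; adapt them to whatever your analytics tool actually provides.

```python
def engagement_summary(visits, brand_terms=("acme",)):
    """Summarize a few audience-engagement signals from an analytics export.

    `visits` is a list of dicts such as
      {"visitor": "v1", "pages": 4, "seconds": 180, "query": "acme almonds"}
    The schema and the 2-page / 60-second thresholds are illustrative only.
    """
    total = len(visits)
    if total == 0:
        return {}

    seen, repeat, branded, engaged = set(), 0, 0, 0
    for v in visits:
        if v["visitor"] in seen:
            repeat += 1          # visitor already seen earlier in this export
        seen.add(v["visitor"])
        query = (v.get("query") or "").lower()
        if any(term in query for term in brand_terms):
            branded += 1         # arrived on a branded search
        if v["pages"] >= 2 or v["seconds"] >= 60:
            engaged += 1         # crude proxy for an engaged visit

    return {
        "repeat_visit_rate": repeat / total,
        "branded_search_share": branded / total,
        "engaged_visit_rate": engaged / total,
    }
```

Tracking numbers like these over time, alongside your own conversion goals, is exactly the kind of user-focused measurement the paragraph above argues for.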

That is how to optimize your SEO strategy for now, for next year, and for the foreseeable future.

You Don’t Have To Be Nuts To Worry About Changing Your Domain (February 20, 2012)
Enterprise SEO is all about mitigating risk. Slow and steady, fix what is broken, don’t let anyone do anything radical chasing the latest fads, don’t push the envelope into anything black or even grey and keep your IT department from inadvertently destroying your rankings.

So what do you say when a large, branded site wants to go about changing a well-established domain?

For the last six years or more, moving a website from one domain to another has been a fairly straightforward, low-risk endeavor: set up a 301 redirect that maps all of your old URLs to your new ones, then sit back and wait a week or two while the search engines crawl your primary URLs, and you are good to go.

If you wanted to speed things along a little, you could also do a change of address in Webmaster Tools and submit critical pages through Fetch As Googlebot.

Sure, Bing and Yahoo would take longer to update and, yes, it might take months for all of the pages in the supplemental index to finally clear out, but in most cases your site was back above 80% of its previous traffic level within a month and returned to previous levels within 60 days of the switch.
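The redirect itself is the easy part. As a minimal sketch, assuming a straight domain swap with unchanged paths (the hostnames below are placeholders), a tiny WSGI app can answer every request on the old domain with a page-to-page 301 to the same path on the new one; on a real site this usually lives in the web server or CDN configuration instead.

```python
from wsgiref.simple_server import make_server

NEW_HOST = "https://www.example-new.com"  # placeholder for the new domain

def redirect_app(environ, start_response):
    """Answer every request with a permanent, page-to-page redirect."""
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    location = NEW_HOST + path + ("?" + query if query else "")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]

if __name__ == "__main__":
    # Serve this behind whatever answers for the old hostname.
    make_server("", 8000, redirect_app).serve_forever()
```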

Re-branding was not painless, but as long as you were simply moving from one domain to another the results were predictable, the process was well defined and the risks were minimal.

If you took it as an opportunity to thoroughly audit your indexation and clean up some legacy issues, it could even be the foundation for some significant gains in your overall traffic.

This no longer appears to be the case; the experience of Nuts.com illustrates that there is now a greater risk when changing domains, especially for older, more established sites.

Case In Point

Changing your domain is no longer certain to be painless or low risk.

As a consequence of rebranding from NutsOnline.com to Nuts.com, The Newark Nut Company is losing thousands of dollars in revenue every day.

The Newark Nut Company is a multi-generational, 83-year-old family business started in an open-air market in New Jersey at the beginning of the Great Depression. The company grew into a brick-and-mortar store with a warehouse and a mail-order business for loyal customers. In 1999, one of the founder’s grandsons decided to try e-commerce and launched NutsOnline.com.

Through hard work, personality, and great customer service, the company became one of the premier online retailers of nuts, seeds, and bulk foods, as well as a category leader for decorative candies such as Jordan Almonds. In other words, the poster child for a family business going online and succeeding with a quirky persona and a commitment to quality.

Preparing To Move

Even though changing domains is usually straightforward, I am a data junkie and I favor a methodical approach that allows me to track the progress of the re-indexation as well as to monitor rankings and traffic.

Prior to the move, site traffic and rankings were very healthy. Over the preceding seven months, the site averaged 30,000-44,000 visits each week from organic Google searches, with traffic steadily rising 5-10% each month.

Every client engagement begins with an audit to discover and correct canonical issues, duplicate titles and descriptions, spiderability problems and any other technical problems that may be hindering rankings; in the case of a migration, this is especially important because we can address the issues before the move to eliminate potential variables and uncertainty.

The first step was to determine how many canonical pages the site contained and to build a sitemap that included those URLs and only those URLs. This provided a baseline for measuring progress and served as the first metric for gauging the indexation of the new site.

In the case of nutsonline.com, we determined that we had about 4,800 core content and product pages, 365 pages in the blog, about 500 tag pages, and 3,250 images that we wanted indexed. This works out to less than 6,000 pages to monitor. Before the move, over 98% of our core pages were being indexed in the sitemap for nutsonline.com.
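As a sketch of that baseline step (the URL list and file name are placeholders, not the actual site’s), a canonical URL list can be turned into a minimal sitemap with nothing more than the Python standard library:

```python
import xml.etree.ElementTree as ET

def write_sitemap(canonical_urls, path="sitemap.xml"):
    """Write a minimal XML sitemap containing only the canonical URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in canonical_urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder usage:
write_sitemap([
    "http://www.example.com/",
    "http://www.example.com/almonds",
    "http://www.example.com/blog/holiday-gifts",
])
```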

While the target number of pages was under 6,000, the site: query for Nutsonline.com, depending on what data center we hit, showed between 198,000 and 245,000 pages in the index.

Some quick digging around found the usual suspects:

  • About 100,000 URLs indexed at site:nutsonline.com/search
  • About 60,000 URLs indexed at site:nutsonline.com/tag
  • About 20,000 URLs indexed at one of the following: cdn.nutsonline.com, staging.nutsonline.com, or https://www.nutsonline.com
  • About 10,000 URLs indexed with parameters such as GCLID, Sort, SID, item, source or department.

This list gave us an excellent starting point for changes we needed to make to the site during the transition in order to right-size our indexation footprint.
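The digging itself is mundane. As a sketch (the bucket names mirror the list above, but the code is illustrative rather than the tool we actually used), you can take whatever URL list you can get — from site: scrapes, log files or Webmaster Tools exports — and sort it into buckets to size the cleanup:

```python
from urllib.parse import urlparse, parse_qs

TRACKING_PARAMS = {"gclid", "sort", "sid", "item", "source", "department"}

def classify(url):
    """Sort an indexed URL into the buckets described above."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    path = parts.path.lower()
    params = {key.lower() for key in parse_qs(parts.query)}

    if parts.scheme == "https":
        return "https duplicate"
    if host not in ("nutsonline.com", "www.nutsonline.com"):
        return "stray subdomain (cdn, staging, ...)"
    if path.startswith("/search"):
        return "internal search result"
    if path.startswith("/tag"):
        return "tag page"
    if params & TRACKING_PARAMS:
        return "tracking/display parameter"
    return "probably canonical"

def bucket_counts(urls):
    """Count how many indexed URLs fall into each bucket."""
    counts = {}
    for url in urls:
        bucket = classify(url)
        counts[bucket] = counts.get(bucket, 0) + 1
    return counts
```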

In a perfect world, we would have waited until the site was pruned to the correct canonical URLs before we changed the domain, but it might have taken months for Google to clean the extra URLs out of the index.

Despite the obvious canonical problems, Google had no difficulty returning the correct pages in the SERPs. Of the top 500 landing pages for organic traffic, not a single one was a non-canonical version, and of the 645,000 organic entries in Q4 of 2011, only 464 landed on non-canonical URLs.

Based on this, we decided to proceed with the following changes:

  • We implemented rel=canonical throughout http://nuts.com to resolve the tracking and display parameters, since blocking them in the Webmaster Tools parameter settings was not removing them from the index.
  • We redirected all of the subdomains on nutsonline.com to http://nuts.com.
  • We put staging behind a password and used robots.txt to block http://cdn.nuts.com.
  • We decided that the internal search results were not something we wanted in the index (despite their generating 1,395 organic entries in Q4), so we added a “noindex, follow” robots meta tag to the head of the search results template.
  • Finally, we reviewed all the tag pages and discovered how different navigation paths could generate 4-, 5- and 6-level-deep tag taxonomies with elements in different orders, creating a canonical nightmare. We added a robots meta “noindex, nofollow” to all tag pages more than one level deep. (A sketch of this tagging logic follows the list.)
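The real changes were made in the site’s templates; the sketch below only illustrates the canonical-parameter and noindex rules described above, using a hypothetical helper and the same parameter names we were fighting. Given a requested URL, it decides what robots meta value the page should emit and which canonical URL it should point to.

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "nuts.com"
DISPLAY_PARAMS = {"gclid", "sort", "sid", "item", "source", "department"}

def robots_and_canonical(url):
    """Return (robots_meta, canonical_url) for a requested URL.

    Mirrors the rules above: noindex internal search results, noindex+nofollow
    tag pages deeper than one level, and canonicalize away tracking/display
    parameters on everything else.
    """
    parts = urlsplit(url)
    path = parts.path.rstrip("/")

    if path.startswith("/search"):
        return "noindex, follow", None

    if path.startswith("/tag"):
        depth = len([seg for seg in path.split("/") if seg]) - 1  # segments after /tag
        if depth > 1:
            return "noindex, nofollow", None

    # Strip tracking/display parameters from the canonical URL.
    kept = "&".join(
        pair for pair in parts.query.split("&")
        if pair and pair.split("=")[0].lower() not in DISPLAY_PARAMS
    )
    canonical = urlunsplit(("http", CANONICAL_HOST, parts.path, kept, ""))
    return "index, follow", canonical
```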

The Big Switch

On January 6th, 2012, we implemented a global page-to-page 301 and did a change of address in Google Webmaster Tools.

Within two weeks, almost all of the queries and impressions on NutsOnline.com had disappeared. The Google Webmaster Tools charts below clearly show the drop-off in impressions for nutsonline.com and the pick-up of impressions for nuts.com.

[Chart: Google Webmaster Tools impressions for NutsOnline.com falling off after the switch]

[Chart: Google Webmaster Tools impressions for Nuts.com ramping up after the switch]

Within a week, over 98% of our canonical URLs on nuts.com were indexed (4408 of 4484 in the primary sitemap) and 95% of the Google organic traffic was gone from NutsOnline.com.

The Big Drop

Unfortunately, two weeks after the transition, overall Google organic traffic for nuts.com was down over 70% and rankings were down across the board, much as though the site had been hit by Panda or some form of penalty.

The third week, January 23rd to the 31st, showed promising signs of a recovery, reaching almost half of the pre-change traffic level on the 31st. Instead of continuing to recover, however, traffic headed down again, as if it were once again weighed down by Panda or some related algorithm.

It is very difficult to compare week-over-week traffic while accounting for seasonality.

It is noteworthy, however, that the week just before the migration (January 1, 2012 to January 7, 2012) represented the single best week for Google organic traffic in the history of the company.

The chart below shows incontrovertibly that traffic took a dive as a consequence of the change of address.

Ranking reports are generally not as valuable a tool as they once were, but of the 81 keywords I track for the baseline report, 19 of the 25 top-ranked terms dropped, along with 26 of 41 top-three spots. In all, 39 terms have simply dropped out of the top 50.

The most important (and most lucrative) keyword used to return the site at #1 with a four-entry sitelink block and deliver almost 10,000 visitors a week. For that same term, the site now fluctuates between positions 14 and 20.

The domain nuts.com had been used previously, and its former webmaster had acquired a spam penalty. My client purchased the domain in October of 2011, registered it with Google Webmaster Tools and submitted a reconsideration request that detailed the history.

The domain was reviewed and we were told that the penalty had been lifted.

Despite this assurance, 17 days after the switchover and 10 days without any significant ranking improvement, we theorized that we might be suffering from some legacy penalty against the nuts.com domain and submitted another reconsideration request. A few days later, we were informed that no manual penalties existed against the site.

Why Has Google Forsaken Us?

NutsOnline.com had been operating almost as long as Google has been in existence. As such, it had the domain history, trust and social signals needed to rank well despite whatever on-site issues may have been sub-optimal.

As soon as the site switched domains, we found ourselves in a position where Google is not ranking the new site the same way as the old. I believe we are suffering from the loss of domain history and trust that accompanies a change of address.

This weakened the domain strength enough to push us over some penalty threshold. Nothing on the site has changed significantly since the switch, but we are no longer the old, crusty domain that has earned trust. We also lost all of our social signals, including thousands of Likes, Tweets, etc.

You May Be Nuts To Try This

Regardless of the underlying cause, the bottom line is that changing your domain is no longer painless or low risk.

As a consequence of changing their domain, The Newark Nut Company is losing thousands of dollars in revenue every day. The options going forward (apart from Google recognizing this is unintentional and somehow fixing the issue) are all less than ideal.

  • We could reverse course and redirect the search engines to nutsonline.com instead of nuts.com.
  • We could implement a cross-domain rel=canonical from nuts.com to nutsonline.com while still using Nuts.com in marketing materials and for PPC. Not only would this be a terrible example of engineering for search engines, it would also create tremendous cost, confusion and, potentially, a loss of trust from our actual customers.
  • We could roll back for Google only. Bing seems to be delivering about 80% of the traffic we got before the 301, and it is gradually getting better. Bing reportedly doesn’t honor cross-domain rel=canonical in any event. This would create no less cost or confusion, but it might be slightly less damaging.
  • We could embark on a massive link building campaign and hope that new links will overpower whatever is holding us down, but high-quality, organic link building takes significant time.
  • Face Mountain View and Pray.

Google went to great pains to develop tools and educate webmasters who want to change domains.

As a result, many companies have succeeded in changing their domain and lived to tell about it. Now, however, with the ever-increasing emphasis on brand and the indirect benefit of social signals, it appears that the search engines do not have a mechanism to transfer the complete history of a domain, only its PageRank.

Be warned: if your enterprise is planning to rebrand, you may find yourself swimming against the tide and desperately trying to avoid getting swept out to sea in what started as a “simple” change of address.

Guest Opinion: Is Google’s Privacy Move Really An Anti-Competitive Practice? (October 27, 2011)
Fresh on the heels of a free pass from a befuddled Congress after admitting that they are a monopoly, Google has decided to cloak search query strings under the guise of privacy, making it clear they are doubling down on their abusive, anti-competitive practices.

Consider the following points:

  1. Cloaking the referrer, and the query string it contains, severely hampers publishers’ and competing ad networks’ ability to monetize site traffic, while Google gets to keep all the data and use it for targeting. The Google display network already had an enormous advantage over competitors; Google is using their monopoly position in search to further an unfair competitive advantage in display.
  2. The lack of query data reduces the value of the page view to the publisher. Regardless of the monetization strategy, losing query data has immediate and long-term impacts on the publisher’s ability to make money and to improve their content.
  3. Hiding the referrer of the visitor alters the fundamental operating agreement that has been in place between users and content creators, especially for ad-supported content providers. Users have shown they don’t want to pay for content directly, so an entire ecosystem has been built around monetizing their “eyeballs” and their intent.
  4. Reducing the value per page view by even a small amount does enormous damage to the ecosystem. Content producers both large and small operate on thin margins, and changes in the revenue per page view change those calculations. Sadly, while Google crushes the competition, they only make pennies for every dollar of value they destroy.

It is complete hypocrisy.

  • Google has been leading the charge towards personalization of search results for more than four years, but they have effectively made it impossible to personalize the landing page or the ads on the page to match the search intent of the user. Unless, of course, you are using Google’s ad network to monetize.
  • Remember AdMob’s howls of protest about how Apple was effectively barring third-party ad networks from iOS by restricting the usage data an app can collect and return to ad vendors? Why is this any different from blocking query strings?
  • Google+ has made it clear that they want real names and that they strive to become the trust, reputation and identity hub of the Web. At a bare minimum, Google is striving to significantly improve its understanding of who we are, what we believe and what our interests are. Amazingly, they are going to collect all this information while hiding everything about us from the sites we choose to visit.

    NOTE: After this was published, Google expressed a concern to Search Engine Land that it wasn’t “hiding all information,” which the editors passed on to me.

    To be fair, or at least precise, Google will continue to provide a referrer for the logged-in visitor but is removing the query string for that particular visit.

    While the query string and the referrer header are not the same thing, for a publisher, an ad server or a search marketer, the query string is the most valuable piece of information in the referrer data.

    Of course, most visits are from non-logged-in users, so they still contain keyword data, at least for now.
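To make concrete what is being taken away, here is a minimal sketch of the kind of parsing every analytics package and ad server does with that referrer today: pulling the q parameter out of a Google referrer URL. The example referrers are fabricated; once the query string is stripped, there is simply nothing left to extract.

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Pull the search query out of a Google referrer URL, if it is present."""
    parts = urlparse(referrer)
    if "google." not in parts.netloc:
        return None
    query = parse_qs(parts.query).get("q", [None])[0]
    return query or None  # None once the query string has been stripped

# With a full referrer, the keyword is recoverable:
print(keyword_from_referrer("http://www.google.com/search?q=organic+almonds&hl=en"))
# A stripped, logged-in referrer yields nothing useful:
print(keyword_from_referrer("https://www.google.com/"))
```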

Take a minute to bathe in the sea of “reassuring” statistics about how this will only affect 1-2% of searches (those currently conducted over https) or perhaps 10-12% (searches by logged-in users). Google is mounting a full-court PR press telling people this won’t be too painful because they are only screwing you 10% of the time; what will they say when it hits 20% or 50%?

They don’t care, because they expect webmasters not to notice as the water boils and competing ad networks to evaporate; and hey, they are a monopoly and we are powerless to stop them. They certainly don’t care if a few sites disappear in the process, because Google doesn’t see an ecosystem; they see an infinite number of sites that want their traffic.

Cue the usual argument about how publishers can opt out of Google traffic (even though Google admittedly has a monopoly on search). This is, of course, complete BS. Google is telling us (again) that if we don’t like their rules we can go home; they are the dungeon masters and it is their dungeon.

Is “Privacy” The Real Deal?

OK, so let’s talk about privacy. Online privacy is an important issue, one that always ranks near the top of users’ concerns, just below convenience and stuff being free. If these changes were really about privacy, or even resulted in improved privacy for users, Google might be forgiven for making this trade-off against the interests of site owners.

It turns out that Google doesn’t even actually claim this is an attempt to improve user privacy, as Danny Sullivan pointed out. Google’s actual statement is that this move is designed to protect the privacy of the search result.

Remember when Google attacked Microsoft for copying search results and it turned out they were capturing click stream data? What Google is protecting under the guise of “privacy” is competitive intelligence.

If we assume that Google has the right to hide query strings from advertising networks and ISPs, the simplest approach would be to simply require site owners to switch to https for secure queries if they wanted to capture the query data in analytics. This would lead to significantly improved privacy and security for users without damaging content producers.

Google has already won the search advertising battle with an effective monopoly in the US and an absolute monopoly in many other countries.

The next front is in display, and cloaking the query is yet another step in leveraging their search monopoly to gain an unfair competitive advantage over other ad networks.

As Joost de Valk points out, Google is really hiding query data from competitors; data about the users Google knows the most about.

I think “privacy” is just a mere pretext. A “convenient” side effect that’s used for PR. The real reason that Google might have decided to stop sending referral data is different.

I think it is that its competitors in the online advertising space, like Chitika and Chango, are using search referral data to refine their (retargeted) ads, and they’re getting some astonishing results. In some ways, you could therefore describe this as mostly an anti-competitive move.

In my eyes, there’s only one way out. We’ve now determined that your search data is private information. If Google truly believes that, it will stop sharing it with everyone, including their advertisers. Not sharing vital data like that with third parties but using it solely for your own profit is evil and anti-competitive.

If Google really wants to do something that improves privacy, they have to start by recognizing that Google’s own data collection is more of an issue than any third party’s, because they know our entire search history and can connect user queries across months or years.

Even when search history is turned off, Google is keeping track of our search progression within the search URL and can build a complex model of our query intent.

Everyone agrees that Google has a monopoly in search, but no one seems to have a clear answer on what to do about it. Perhaps this latest power grab provides a roadmap to how we can open up the industry to competition and somewhat level the playing field.

The most effective, minimally destructive solution would be for the US Congress (unlikely) or the EU (possible) to require that Google share all of the click data they maintain for internal purposes.

Making all query and click data open would significantly level the playing field and make it easier for other companies to compete with a monopoly in search and display, while creating an honest conversation about user privacy.

There can be no privacy solution that disguises the massive advantages Google holds in their ability to collect and analyze data. Only a scenario that aligns the interests of the search engines, ad networks and publishers and forces them to all play by the same rules can allow us to actually address user privacy.

Online Marketers: Stop Funding Virtual Blight (March 24, 2008)
Urban blight is easy to recognize: seedy liquor stores and payday lenders on alternating corners, trash-strewn lots and front yards, graffiti-covered buildings, crumbling sidewalks, broken glass, and billboards everywhere you look. Websites afflicted with virtual blight are just as easy to spot: banners promising hot sexy singles and cheating spouses, pornography and Viagra, payday loans and “OEM prices” on Adobe’s Creative Suite 3, all bombarding us with offers that are ethically suspect and often illegal.

The devastation of urban blight is well documented. Residents flee, businesses move out, and property values plummet; the only people left are the ones who cannot afford to live anywhere else. The damage done by Virtual Blight goes well beyond the devaluation suffered by the site owner. The real damage is to the perception of the Internet as a trustworthy medium, a safe place to do business and promote your brand.

Americans spend more time online than watching TV, and Internet users are more educated and affluent than TV viewers. Despite these statistics, total online marketing investment in the U.S. is less than 10% of the amount spent on television. One of the obstacles to more money flowing online is the public perception that the Internet is filled with fraud, deception, and unscrupulous people. The public is bombarded with sensational stories of scams and hackers, and with allegations that the web is full of disinformation, scams, viruses, and criminals. These messages are reinforced by the evidence of blight we are presented with every day.

Consumer opinion about a brand or medium is generalized from a few specific data points. This point was illustrated in last week’s aptly titled MediaPost article, “Identity Thieves Also Steal Brand Equity,” about brick-and-mortar retailers who have been hit with high-profile thefts of customer information.

From the article: “These thefts zap trust,” says Ken Banks, of KAB Consulting, a marketing and brand consultant in Seminole, Fla. “If people feel they can’t trust a store with basic financial transactions, why should they trust it on anything?”

Do senior executives see the Internet as a safe place to associate with their brand? Is this perception preventing U.S. marketing dollars from flowing online? The public is skeptical of advertising claims made on television, but there is a common belief that claims made on TV pass regulatory scrutiny. Television is able to transfer belief to the viewer and provide advertisers with credibility.

To compete with the subconscious influence of television, online marketers have to be diligent about where they place their brands. Current ad networks, both search and display, offer many opportunities for embarrassing or damaging brand placements. Faced with the possibility of seeing their ad on a blighted site, or of being tarnished by association with a casino or adult site sharing a page view, brick-and-mortar marketers limit their investments in online advertising.

What does this have to do with search?

Search marketers share the blame for a lot of the blight infestation. Google created the link economy nine years ago, and ever since, site owners have been hunting for short cuts: quick, cheap, and easy ways to dominate Google results. Link farms, comment spam, and blog spam were each once the newest “trick” for getting top rankings.

Fighting web spam is a major effort for all of the engines. Eventually, enough people discover a technique designed to improve organic ranking and, suddenly, the hottest trend is labeled web spam. The savvy search marketer moves on, while others continue to spin their wheels with outdated advice, pursuing a strategy that is dead. The collateral damage from these skirmishes between marketers and engines is countless thousands of sites blighted by link hunters, spammers, and affiliate marketers; once-vibrant destinations become ghost ships, depopulated sites with outdated content that link to sites that have been removed from the index.

Content targeting, which is not search marketing but usually falls under the responsibility of search marketers, also deserves a lot of the blame for blight. AdSense, by providing instant monetization, did more to drive the proliferation of scraper sites and splogs than any other factor. Running your ads on thousands of sites and letting the algorithms match your message to the content of another site may sound like a great idea, but we owe it to ourselves to spend at least some time vetting our distribution partners and making sure we are comfortable with the match.

Beyond Search

Display advertising and affiliate programs go beyond the definition of search marketing, but many of us have responsibilities that extend into these arenas. Whatever your role, it is essential that we understand that the money that drives the creation of most forms of blight is coming from our marketing budget. If we fail to police our affiliates or monitor the distribution of our advertising networks, we are responsible for the impact of the campaigns just as surely as if we had programmed the comment spam bot ourselves.

Jonah Stein is founder of Its The ROI, an SEO/SEM company focusing on the art of SEO and the science of PPC and creator of Virtual Blight, a site dedicated to organizing netizens against online spam, scams & scoundrels.
