The post Become an SEO rock star: Evolve your SEO skill set appeared first on Search Engine Land.
The answer is always the same: True SEO professionals evolve quickly and are in a constant mode of progressing their skill set. Successful SEO is less about what you know and more about what you can get done.
Talented search professionals are not only experts at evolving their skill sets and adapting to Google's frequent changes; they are also experts at understanding how an organization works across its many functions.
Every manager is looking for someone who can learn quickly on the job and already knows how to work cross-functionally to achieve objectives. These are core characteristics of a rock star, or a star on the rise, in any company.
In this article, I am breaking out the core areas of self-improvement SEO professionals should focus on to continue progressing in SEO and increase their value to the organization.
These are not the run-of-the-mill required SEO skills, but obtaining them can place you in a different league in the search space, where driving organic growth is only one of the values you deliver.
The more you understand about areas outside of SEO that impact SEO, the more impact you can drive across the entire organization.
Marketing is a great place to start, given the common split of SEO professionals coming from either a marketing or technology background.
Below are some of the areas you should have a solid understanding of today that are marketing-centric or closely tied to the marketing function:
1. Copyright and trademark law. No need to be a lawyer, but you should understand how copyright and trademark laws work to protect intellectual property.
SEO professionals can provide guidance to internal or external legal teams by demonstrating how others are capitalizing on our intellectual assets. Demonstrating where a violator is profiting from our marks, and where they are creating consumer confusion, is fundamental to resolving copyright and trademark issues. Infringements occurring online are likely having an impact on our SEO traffic and potentially on how Google evaluates our website.
2. Domain management and strategy. Building on copyright and trademark law, understanding the options available to protect against trademark infringement, and to resolve trademark use issues in the domain space with your legal team, can be extremely helpful in fixing traffic leaks or preventing them from occurring.
I define traffic leaks as online entities designed to siphon traffic destined for your domain.
Every dollar spent marketing your brand either offline or online should be driving traffic to your website. When that is not the case, you have a traffic leak.
3. Affiliate networks. In a company I worked for previously, we had what we called the weekly thief report. The name of the report was a bit of a joke, but the report’s purpose was simply to identify where the affiliate channel was cannibalizing existing spend or efforts from other marketing channels.
Understanding how affiliate networks function and how affiliates utilize the system to make their money makes it much easier to identify where cannibalization is occurring, and more importantly, where the affiliate is failing to provide value in the customer journey.
A typo-domain redirect that uses IFrames (inline frames) to duplicate your content on the affiliate's domain, without bringing unique content or value to the customer, is just one trick that occurs in the affiliate space. Affiliates use numerous other tricks to redirect traffic from your website to another website where they are compensated for orders or leads.
A percentage of this traffic should have arrived at your website as an SEO referral source. Spend time with your affiliate manager to review the program so you can identify traffic leaks that are negatively impacting your SEO program. Use the information you obtain about how trademark and copyright law works to determine what options are available to resolve and prevent future traffic leak issues.
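To make traffic-leak hunting concrete, here is a minimal sketch, in Python, of generating common typo-domain candidates you could then check against registrations and your affiliate program's approved domain list. The brand name is hypothetical.

```python
def typo_candidates(brand: str, tld: str = ".com") -> set[str]:
    """Generate common typo-squat candidates for a brand domain.

    Covers adjacent-character swaps, single-character deletions and
    doubled letters -- patterns often seen behind affiliate
    typo-domain redirects.
    """
    swaps = {brand[:i] + brand[i + 1] + brand[i] + brand[i + 2:]
             for i in range(len(brand) - 1)}
    deletions = {brand[:i] + brand[i + 1:] for i in range(len(brand))}
    doubles = {brand[:i] + brand[i] + brand[i:] for i in range(len(brand))}
    return {c + tld for c in (swaps | deletions | doubles) - {brand}}

# "acmestore" is a made-up brand; compare the output against domain
# registrations and the domains your affiliates actually use.
candidates = typo_candidates("acmestore")
```

A real audit would also cover added-character and wrong-TLD variants, but even this short list usually surfaces a registered typo domain or two worth discussing with your legal and affiliate teams.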
4. Paid search. Simply put, you should know exactly what is working in your paid search programs. For example, what products perform well in product listing ads (PLAs)? These are likely the best product page targets for SEO, given that these product uniform resource locators (URLs) have already demonstrated the ability to convert traffic. Understanding how paid search drives clicks through its creative can help you rewrite titles and descriptions that will result in a higher click-through rate for SEO.
What keywords are bid to be visible versus bid to win? SEO and paid search should be aligned on keywords identified as bid to win to ensure both teams are fighting for top positions for these keywords. Aligning with paid search on brand term strategy can reduce spend on branded terms and allow paid search to go after traffic where it is more difficult for SEO to compete.
5. Campaign calendars. Search engine optimization does not participate in every campaign the traditional or digital marketing teams launch. Depending on timing and/or how long the campaign will run, it may not make sense to focus SEO efforts on campaign support.
It is important to be able to review a campaign calendar and identify which campaigns are going to drive search demand and what keyword searches these campaigns are going to trigger. For long-running campaigns or where the campaign has an extended reach, SEO teams should make sure there is an effort to capitalize on increased search demand versus allowing competitors to benefit from their marketing spend.
If you are new to evaluating campaign calendars, you can improve the impact of SEO participation by working with paid search teams to learn how each campaign affected demand in the search engine results pages (SERPs) and how they capitalized on the increased demand.
6. Creative. While this is not always present in the marketing department, marketing personnel typically work together with the creative teams on deliverables for a campaign.
Every SEO professional should place a focus on improving their creative writing skills. Spend time with your creative teams and review what content pieces resonated with the target audience and learn from their successes. Variation in content types to drive a different consumer behavior is critical when developing a content strategy for your SEO program.
Do not forget to be a good partner. SEO has access to information and tools that help us determine what keywords and questions are trending in the search engines. Share information with the creative team so they are contributing to SEO as part of their process of developing content for the customers. Every piece of content you do not create in SEO is a win in terms of scaling your program.
7. Forecasting. We can’t create an SEO forecast without understanding the objectives and process for the entire marketing channel forecast. The desired outcome for the entire marketing mix, along with knowing the objectives around cash flow after marketing, is critical to understanding what is expected from your SEO channel.
Each quarter, the percentage of revenue contribution for your channel may need to shift to allow for the entire marketing mix to be successful. Using the information you have on how your SEO program performs as it relates to cost versus revenue, current run rate and percentage of revenue contribution, take a crack at a full marketing forecast across all channels.
To refine your forecast process, obtain the same information from the other channel managers and compare your forecast to the company’s forecast quarterly, as well as the actuals reported.
Performance trends for other channels will have a direct impact on future forecasts for SEO. Marketing is a team effort: When one channel is down, another channel must step up and pick up the slack.
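As a starting exercise for that full-mix forecast, splitting a revenue target by each channel's expected contribution is a reasonable first pass. The channel names and figures in this sketch are hypothetical.

```python
def forecast_channel_revenue(total_target: float,
                             contribution_pct: dict[str, float]) -> dict[str, float]:
    """Split a quarterly revenue target across marketing channels.

    contribution_pct holds each channel's expected share of revenue;
    the shares must sum to 1.0 for the mix to be coherent.
    """
    if abs(sum(contribution_pct.values()) - 1.0) > 1e-9:
        raise ValueError("channel shares must sum to 100%")
    return {channel: round(total_target * share, 2)
            for channel, share in contribution_pct.items()}

# Hypothetical quarterly mix: if paid search is expected to underperform
# next quarter, SEO's share may need to rise to keep the total on plan.
mix = {"seo": 0.35, "paid_search": 0.40, "affiliate": 0.15, "email": 0.10}
forecast = forecast_channel_revenue(1_000_000, mix)
```

Comparing this kind of back-of-the-envelope model against the company forecast and the reported actuals each quarter is exactly the refinement loop described above.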
When evaluating job opportunities, always look for a position that will allow you to grow beyond your current SEO skill set. Exposure to other functional areas in the organization, and/or even responsibilities to manage other groups as part of your role, will help build you into a well-rounded professional.
With that said, let’s take a look at the technology list of self-improvement areas. The technology areas of concern in this section are focused on deeper levels of tech that can help you mature your SEO program through enhanced training for technical personnel, a more thorough analysis of both the bot and user experience and how to leverage technology to simplify routine SEO operations.
8. Front-end development. The ability to read and audit front-end code is an essential skill for SEO. Brush up on:
My recommendation is to always have a website you own and maintain. Where possible, create a website and register as an affiliate through one of the affiliate networks like Commission Junction, Rakuten LinkShare or ShareASale.
The more you explore the marketing and tech sides with your own site, the easier it will be to evaluate the websites you are optimizing. The goal is to reach a level of competency that allows you to speak fluently with front-end developers about code requirements, and eventually to conduct SEO courses for front-end development.
9. Load balancing. Business growth is a good thing, but it requires website scaling to accommodate increases in traffic. A firm grasp of how load-balancing technologies work to route traffic gives you a significant advantage in SEO.
For example, you may be analyzing massive log files to determine how crawlers are interacting with the website, what issues they encounter and how frequently they visit. Load balancing allows for the creation of a replicated bot farm, where bot traffic is offloaded to a group of servers running the exact same code base as the customer-facing servers. Segmenting this traffic allows for more crawler activity without consuming the resources used to serve customers.
The technology team will agree to the segmentation because offloading bot activity to dedicated servers makes it much easier to achieve consistent page load times; random crawl activity otherwise consumes resources that could be serving customers.
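The routing decision itself is simple to express. This sketch assumes a load balancer that can key on the User-Agent header; the pool names are illustrative, and in production the rule would live in the load balancer configuration rather than application code.

```python
BOT_SIGNATURES = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def pick_server_pool(user_agent: str) -> str:
    """Decide which server pool should serve a request.

    Known crawlers go to a replicated bot pool running the same code
    base; everyone else stays on the customer-facing pool.
    """
    ua = user_agent.lower()
    return "bot-pool" if any(sig in ua for sig in BOT_SIGNATURES) else "customer-pool"
```

A real deployment would also verify crawler IP ranges, since anyone can spoof a Googlebot user agent, but the segmentation principle is the same.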
10. Log file analysis. Why do we want bot traffic segmented from customer traffic? One advantage is the ability to analyze logs that are serving a specific need.
Analyzing logs without having to parse out non-bot user agents to get a thorough understanding of the experience a crawler encounters as part of their crawl simplifies the process immensely. Analyzing samples of logs from the farm of servers serving the customer can help you glean information about issues with the customer experience, issues that will eventually show up in the bot logs.
Start out using applications like Screaming Frog's Log File Analyser and Deep Log Analyzer, and then build up to utilizing log aggregation systems or importing logs into a database that can be queried.
Log aggregation systems become more critical if traffic is not segmented by user agent. Larger websites do not always create separate resource pools for customers versus crawlers, but log aggregation systems allow for advanced exports where specific user agents can be targeted.
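As a small illustration of bot-focused log analysis, this sketch parses combined-format access log lines and counts Googlebot hits per URL path. The sample log lines are fabricated for the example.

```python
import re
from collections import Counter

# Combined log format: IP, identd, user, [time], "request", status,
# bytes, "referrer", "user agent".
LOG_PATTERN = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"')

def crawl_frequency(lines, bot_token: str = "Googlebot") -> Counter:
    """Count how often a given crawler hit each URL path."""
    hits = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and bot_token in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Oct/2017:13:55:36 +0000] "GET /products/widget HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/Oct/2017:13:55:40 +0000] "GET /products/widget HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
hits = crawl_frequency(sample)
```

Sorting the resulting counter by frequency quickly shows which sections of the site consume the most crawl budget, and grouping by the status field surfaces the errors crawlers are encountering.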
11. Linked data. This is an area where you can leapfrog over SEO professionals that have “rested on their laurels” while the web was moving forward.
Today, most SEO professionals fully understand how webpage linking works to provide value for a visitor and how it impacts SEO. Linked data is the next step, where data can be aggregated from multiple sources into a single resource for the customer.
Most of us are aware of linked data technologies like JavaScript Object Notation for Linked Data (JSON-LD) because Google has been clear that it’s a preferred markup language for providing additional descriptive data around elements in our websites.
Spin up on linked data to understand the end game: a connected web where information sharing is much more fluid for the customer. Instead of providing a single content page that links to other pages presenting useful information on the same topic, you can use linked data to bring that content into your page. It will also acknowledge the creator and build a more useful page for your visitor.
Beware of building pages where you have not provided added value and are just aggregating other sources. If you are an Excel user, imagine being able to perform a VLOOKUP across the web to create a single data set from numerous data sources.
Linked data is powerful, and we are only beginning to see propagation of use.
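To see what the markup looks like in practice, here is a sketch that builds a hypothetical schema.org Product snippet and wraps it in the script tag Google reads. All product values are made up for illustration.

```python
import json

# Hypothetical product values; in practice these come from your catalog.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A sample product used to illustrate JSON-LD markup.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Emit the script tag that carries the markup in the page.
snippet = ('<script type="application/ld+json">'
           + json.dumps(product_jsonld)
           + "</script>")
```

Because JSON-LD sits in its own script block rather than being woven into the visible HTML, it is typically the easiest structured data format to template and maintain at scale.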
12. DNS and content delivery networks (CDNs). I am combining these two tech areas because they overlap.
For example, nearly all websites today have access to content delivery networks that help scale the website to handle more traffic. CloudFlare, Fastly, Instart Logic, Amazon S3 and Akamai are just a handful of CDNs that are in use today that have an impact on SEO.
Leveraging these resources for SEO purposes can ease the work placed on internal technology partners and allow for faster execution for the customer.
One example would be redirects. Certainly, you want to be fully aware of any domain-level redirects configured for your domains. Typo domains are a great example: point their domain name system (DNS) records at a server or CDN that redirects traffic at the domain level, making sure the appropriate 301 response code is returned to the crawler.
Other redirects we work with for platform migrations, hypertext transfer protocol secure (HTTPS) migrations and the handling of routine redirects to accommodate discontinued content are handled much more easily at the CDN than by the internal servers.
Every task your internal servers do not have to handle makes room for the efficient execution of the tasks they must handle. Offloading redirects to CDN partners eliminates bloat in configuration files, and the right CDN partners make mass redirects a breeze.
Best of all, consolidating redirects at the CDN allows both tech and SEO access to the redirect rules.
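Conceptually, an edge redirect layer is just a rule lookup performed before the request reaches your origin. This is a minimal sketch with hypothetical paths; a real CDN would express the same rules in its own configuration format.

```python
# Hypothetical edge-redirect rules, the kind you would push to a CDN
# instead of bloating internal server configuration files.
REDIRECTS = {
    "/old-category/widgets": "/shop/widgets",
    "/summer-sale-2016": "/sale",
}

def handle_request(path: str):
    """Return (status, location) for a request path.

    301 signals a permanent move so crawlers consolidate link equity;
    a None location means the origin serves the page normally.
    """
    target = REDIRECTS.get(path.rstrip("/") or "/")
    if target:
        return 301, target
    return 200, None
```

Keeping the rule table in one place is what makes the shared tech/SEO access described above possible: both teams review and edit the same mapping.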
13. Website performance. I hammer home website performance in nearly every article I write because Google is always going to consider speed as a critical component of the user experience.
Focus on learning how to identify performance issues, find out whom to work with in your organization to resolve them, and learn the performance tools of the trade. If you are working for an online retailer and you do not know the service level agreements (SLAs) your technology team has for page load times by page type, you are already behind the curve.
If your technology team does not have assigned SLAs, they are behind the curve. Work with your tech teams to track core metrics like:
Once core metrics are set, be sure to set goals that can be converted to SLAs. While performance is essential to SEO success in a mobile-first world, driving performance forward is a customer win regardless of traffic source.
Working in SEO always comes with significant homework.
The challenging part of SEO has always been adapting to changes and the ability to envision where Google is going next. Certainly, if we are focusing on improving the experience from the search engine to the fulfillment of the user’s intent, we are already working ahead of the Google algorithm.
Understanding where Google is heading next in its endeavor to measure the user experience can help us prioritize improvements in those areas before the algorithm gets there.
Keep in mind learning on the job works very well when you have the right teacher. If you do not have the right teacher, there are many resources online to help you develop your skill sets.
I frequently find myself switching from iTunes to Lynda.com to listen to a course while I’m working. If I’m focusing on a tech skill improvement, I often use the resources at O’Reilly, as the monthly cost is very reasonable and the technology areas are covered in more detail. Lynda.com’s cost is also very reasonable, but I have found it more useful for diving into the areas of marketing and analytics.
Regardless of your source of learning, make self-education a priority. Keep on top of the articles on Search Engine Land, and stay up to date on what is happening in SEO today to help inform your curriculum decisions. Never stop learning.
SEO revolves around change, and change always has a learning component. Embrace the journey, set personal development goals, and if you work in a company where a teacher does not exist, become the teacher.
The post Making website speed and performance part of your SEO routine appeared first on Search Engine Land.
Based on my experience, it has become clear to me that Google will place a stronger weight on the customer's experience with page load speed as part of its mobile-first strategy. The investment Google has made in page performance is a strong indicator of how critical this factor is now and will be in the future.
Now that we are aware page performance is very important to Google, how do we as digital marketing professionals work speed and performance into our everyday SEO routine?
A first step would be to build the data source. SEO is a data-driven marketing channel, and performance data is no different from positions, click-through rates (CTRs) and impressions. We collect the data, analyze, and determine the course of action required to move the metrics in the direction of our choosing.
With page performance tools, it is important to remember that any single measurement may be inaccurate. I prefer to use at least three tools for gathering general performance metrics so I can triangulate the data and validate each individual source against the other two.
Data is only useful when it is reliable. Depending on the website I am working on, I may have access to page performance data on a recurring basis. Some tool solutions, like Dynatrace, Quantum Metric, Foglight and IBM Tealeaf, collect data in real time but come with a high price tag or limited licenses. When cost is a consideration, I rely more heavily on the following tools:
Use multiple tools to capitalize on the specific benefits of each, and look to see whether the data from all sources tells the same story. When it does not, there are deeper issues that should be resolved before performance data can be actionable.
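One simple way to triangulate is to flag any tool whose reading strays too far from the median of the group. The tool names and readings in this sketch are hypothetical.

```python
from statistics import median

def triangulate(readings: dict[str, float], tolerance: float = 0.25):
    """Cross-check page load readings (in seconds) from several tools.

    Returns the median reading plus any tool whose value deviates from
    that median by more than `tolerance` (25% by default).
    """
    mid = median(readings.values())
    outliers = [tool for tool, value in readings.items()
                if abs(value - mid) / mid > tolerance]
    return mid, outliers

mid, suspects = triangulate({"pagespeed": 4.1, "pingdom": 4.4, "webpagetest": 7.9})
```

A flagged tool is not necessarily wrong: it may be testing from a different geography or with a cold cache, which is itself useful information about the experience some customers receive.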
While it is more than feasible to analyze a single uniform resource locator (URL) you are working on, if you want to drive changes in the metrics, you need to be able to tell the entire story.
I always recommend a sampling approach. If you are working on an e-commerce site, for example, and your focus is a specific product detail page, gather metrics for that URL, then sample 10 product detail pages to produce an average. There may be a story unique to the single URL, or the story may be at the page-type level.
Below is an example of a 10-page average captured across multiple page types, using Google PageSpeed Insights as the source.
Evaluating this data, we can see all page types are exceeding a four-second load time. Our initial target is to bring these pages into a sub-four-second page load time, 200 milliseconds or better on response and a one-second above-the-fold (ATF) load time.
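The sampling-and-targets workflow can be sketched as follows. The metric values are hypothetical, only three sample pages are shown for brevity, and a real sample would use 10 URLs per page type.

```python
# Targets from the text: sub-4s load, 200ms response, 1s above the fold.
TARGETS = {"load": 4.0, "response": 0.2, "atf": 1.0}  # seconds

def sample_average(samples: list[dict]) -> dict:
    """Average each metric across a sample of same-type pages."""
    return {metric: round(sum(s[metric] for s in samples) / len(samples), 2)
            for metric in samples[0]}

def missed_targets(averages: dict) -> list[str]:
    """Metrics where the page-type average misses its target."""
    return [metric for metric, value in averages.items()
            if value > TARGETS[metric]]

# Hypothetical product-detail-page readings.
pdp_sample = [
    {"load": 5.2, "response": 0.31, "atf": 1.6},
    {"load": 4.8, "response": 0.24, "atf": 1.2},
    {"load": 5.6, "response": 0.28, "atf": 1.5},
]
averages = sample_average(pdp_sample)
```

The list of missed targets per page type is a ready-made agenda for the conversation with your technology partners.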
Using the data provided, you can do a deeper dive into source code, infrastructure, architecture design and networking to determine exactly what improvements are necessary to bring the metrics in line with the goals. Partnering with the information technology (IT) team to establish service level agreements (SLAs) for load time metrics will ensure improvements remain an ongoing objective of the company. Without the right SLAs in place, IT may not maintain the metrics you need for SEO.
Using Pingdom, we can dive a bit further into what is driving the slower page loads. The waterfall diagram demonstrates how much time each page element requires to load.
Keep in mind that objects will load in parallel, so a single slow-loading object may slow ATF load but may not impact the overall page load time.
Review the waterfall diagram to find elements that are consuming excessive load time. You can change the sort and file size to identify any objects that are of excessive size.
A common issue is the use of third-party hosted fonts and/or images that have not been optimized for the web. Fonts load above the fold, and delays in response from a third-party font provider can bring the page load to a crawl.
When working with designers and front-end developers, ask whether they evaluated web-safe fonts for the design. If web-safe fonts do not work with the design, consider Google Fonts or Adobe Typekit.
You can also evaluate the page weight by file type to determine if there are excessive scripts or style sheets called on the page. Once you have identified the elements that require further investigation, perform a view source on the page in your browser and see where the elements load in the page. Look closely for excessive style sheets, fonts and/or JavaScript loading in the HEAD section of the document. The HEAD section must execute before the BODY. If unnecessary calls exist in the HEAD, it is highly unlikely you will be able to achieve the one-second above-the-fold target.
Work with your front-end developers to ensure that all JavaScript is set to load asynchronously. Loading asynchronously allows other scripts to execute without waiting for the prior script call to complete. JavaScript calls that are not required for every page, or that are not required to execute in the HEAD of the document, are a common issue in platforms like Magento, Shopify, NetSuite, Demandware and BigCommerce, primarily due to add-on modules or extensions. Work with your developers to evaluate each script call for dependencies in the page and whether its execution can be deferred.
Cleaning up the code in the HEAD of your webpages and investigating excessive file sizes are key to achieving a one-second above-the-fold load time. If the code appears to be clean, but the page load times are still excessive, evaluate response time. Response timing above 200 milliseconds exceeds Google’s threshold. Tools such as Pingdom can identify response-time issues related to domain name system (DNS) and/or excessive document size, as well as network connectivity issues. Gather your information, partner with your IT team and place a focus on a fast-loading customer experience.
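A quick audit for render-blocking scripts in the HEAD can be automated. This regex-based sketch is illustrative only; an HTML parser is more robust in practice, and the sample page is fabricated.

```python
import re

def blocking_head_scripts(html: str) -> list[str]:
    """List external <script> tags in the HEAD that lack async/defer.

    Such scripts block parsing and push out the above-the-fold time.
    """
    head_match = re.search(r"<head>(.*?)</head>", html, re.S | re.I)
    if not head_match:
        return []
    scripts = re.findall(r"<script\b[^>]*>", head_match.group(1), re.I)
    return [tag for tag in scripts
            if "src=" in tag and "async" not in tag and "defer" not in tag]

page = """<html><head>
<script src="/js/tracking.js"></script>
<script src="/js/app.js" defer></script>
</head><body></body></html>"""
```

Running a check like this across your page templates gives developers a concrete list of script tags to defer, move to the BODY or load asynchronously.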
Google’s algorithm will continue to evolve, and SEO professionals who focus on website experience, from page load times to fulfilling on the customer’s intent, are working ahead of the algorithm.
Working ahead of the algorithm allows us to toast a new algorithm update instead of scrambling to determine potential negative impact. Improving the customer experience through SEO-driven initiatives demonstrates how a mature SEO program can drive positive impact regardless of traffic source.
The post The upcoming mobile app Monday: Be prepared appeared first on Search Engine Land.
The season is upon us: mobile download season. Christmas falls on Monday, and if history holds true, Christmas and the day after will be the top mobile app download days of the year. With less than a week left, your app store optimization (ASO) activities should be in full swing.
Becky Peterson heads up our app store optimization at Walgreens. Becky was looking to be on the nice list, so she put together some optimization tips for new and existing apps to help you maximize the download season.
Capitalizing on the top download days of the year can be the difference between an average app and a top download. Keep your content fresh, do not over-optimize, and remember that the goal is to assist customers in finding the right app for the right purpose. Put in the effort, set download goals, and allocate plenty of time to respond to the flurry of reviews that occur soon after installation and use.
Remember to document your lessons learned once the season is over. Download season will be back before you know it, and those valuable lessons can be the difference-maker next year.
The post The SEO ‘do more with less’ cookbook appeared first on Search Engine Land.
“Do more with less.” How often in our careers have we heard that phrase? Ultimately, that statement always means there is a need to reduce budget while still maintaining growth (or, at a minimum, flat year-over-year performance).
The good news is that in SEO, we are the kings and queens of “do more with less.” SEO professionals today are constantly competing against significantly larger teams — unless, of course, you are working at the online gorilla Amazon or in a top affiliate organization.
Over the past 20 years working in SEO, I have worked at pure-play, omnichannel, startup and Fortune 500 companies, and the cookbook for doing more with less contains the same recipe. Sure, the recipe may need to be modified at the ingredient level to increase servings, but the ingredients never change. What you should find in your cookbook for your “more with less” recipe is as follows:
Myself, I like to add a bit of a kick to my recipe: I step back and think big picture. How can I adjust my ingredient amounts to maximize the effort to include value for all channels?
Demonstrating impact across all channels is critical in obtaining resources to support my objectives today, and it establishes credibility within the organization long-term. The nature of our profession requires that an SEO professional routinely take off their marketing hat and explore user experience, merchandising and broader technology issues. These areas of the business have a direct impact on the performance of all marketing channels as well as direct traffic.
While we are always looking for program improvements, clearly there are times where we must squeeze the most out of the program to achieve the goals assigned to us by the company. I like to use a divide-and-conquer approach to make sure I have dedicated attention to each core growth activity.
In the divide-and-conquer approach, I typically take on the global impact improvements and task my other team members to devise a strategy to tackle the SEO-specific activities. Depending on your team size, you may have to do all the activities, or you may be able to spread them out evenly across the team. Regardless of team size, every growth opportunity area must be worked. Don’t forget to include your key partners as well when assigning out the activities.
For my part, I am going to specifically look at areas of the website where the data indicates that an improvement in user experience, merchandising and/or performance can drive additional revenue. In this example data set, I pulled landing page data from Google Analytics. This can be entry page data from Omniture or Coremetrics as well. The key area of analysis in this data set is focusing on potential opportunities by evaluating engagement metrics and conversion rate.
Looking at this hypothetical sample data set, which represents one week of data, there are a few items I have highlighted in red and green. Both are opportunities, but the green cells represent values I would want to replicate, while the red cells represent values I am targeting to improve. The primary information we are hoping to glean from the report is as follows:
Building out this model across a large number of pages provides a solid list of where more SEO revenue can be obtained where position is not the primary factor. The traffic is arriving, and certainly the quality of the traffic may differ based on position, but revenue movement is possible — and that movement has value across all traffic sources. Arguably, the improvements could result in positive movement in position, which would bring added value to the project.
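The opportunity model above can be expressed as a simple filter over exported landing-page rows. All page names, thresholds and numbers in this sketch are hypothetical.

```python
def opportunity_pages(rows, min_sessions: int = 1000, cr_target: float = 0.02):
    """Flag landing pages with healthy traffic but weak conversion.

    Rows mimic a Google Analytics landing-page export. Pages above the
    session floor but below the conversion-rate target are the UX and
    merchandising opportunities where position is not the limiting
    factor.
    """
    flagged = []
    for r in rows:
        cr = r["transactions"] / r["sessions"]
        if r["sessions"] >= min_sessions and cr < cr_target:
            flagged.append((r["page"], round(cr, 4)))
    return flagged

rows = [
    {"page": "/shop/widgets", "sessions": 5400, "transactions": 43},
    {"page": "/shop/gadgets", "sessions": 6100, "transactions": 201},
    {"page": "/blog/widget-guide", "sessions": 800, "transactions": 2},
]
flagged = opportunity_pages(rows)
```

Engagement metrics such as bounce rate and pages per session can be added as extra filter conditions in the same loop to separate relevance problems from merchandising problems.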
The post Maximizing the potential and value of your SEO team appeared first on Search Engine Land.
The complexity of working an SEO program has grown exponentially over the years, creating both challenges and opportunities for search marketers.
SEO professionals today cannot simply rely on their marketing skills to drive a program. User experience, front-end development and data analysis skills are essential for success today. Honing these skills as part of your professional development can pay dividends, both for yourself and your employer.
Managing SEO programs requires that we not only drive SEO results, but we must also drive growth opportunities for our teams. Growth opportunities and career progression are core concerns for employees. Investing in the growth of our personnel allows for driving significantly more impact across the organization.
Laura Dillon, a senior analyst on our Walgreens team, is a prime example of how SEO expertise can be leveraged to build stronger relationships with internal customers while driving value for the company. In addition to her SEO responsibilities, Laura owns the SEO contribution to site internal search.
Internal search for many websites is a key interaction and discovery tool for the consumer, but it requires management. SEO professionals understand at a deep level how a term search creates an experience path for the customer — and that path can be enhanced.
Laura works with the product, merchandising and IT teams to improve the search experience by providing trend analysis, developing strategies for holidays, working null results and improving site messaging.
Exposing partners in the organization to the value of our SEO teams builds support for our SEO initiatives. In addition, the exposure reinforces the strategy of driving SEO through experience gap analysis and an understanding of the customer’s intent.
Seek out opportunities for your team members to participate more broadly across the organization, but remember that this requires a commitment as a manager as well. The commitment we have as managers is to grow our personnel, maximize their potential, expose them to areas where their expertise can drive value and take a vested interest in their career path.
Areas where you may want to broaden the growth of your personnel to create opportunities to expand organizational impact are as follows:
Partnering well within the organization and driving impact external to SEO are primary tenets to building a best-in-class SEO program. Investing in your personnel and providing them ownership in areas where they can realize the impact of their work while growing as professionals will drive retention goals.
With every team member, I stress that SEO is not about what you know or what you can do — it is what you can get done. Partner well internally, leverage your team to their potential, and you will be amazed at how much you can actually get done.
The post Developing content for the customer journey appeared first on Search Engine Land.
Ten years ago, referring to content on a page as “SEO content” was often appropriate. Keyword density was still a strong factor for ranking page content, and SEO professionals struggled with achieving SEO objectives while still providing an engaging content experience for the customer.
Today, I still occasionally hear content requested and/or developed by my team referred to as “SEO content.” While it is easy to be offended, the fact is that there was a time in SEO when content quality was not our top priority, so we must own our past. Certainly Google, Bing and Yahoo share part of the blame, as we were simply playing the hand we were dealt at the time.
Google has since reshuffled the deck, and the hand we are dealt today requires that our content compete at a quality level. Now when I hear someone refer to content as “SEO content,” I take a deep breath, and I begin my education process. The process always begins with acknowledging the past, and then it is followed with a detailed explanation of how search engine optimization has evolved into search experience optimization.
Search experience optimization is focused purely on enhancing the customer journey. These days, a search query is often the starting point of that journey. Unfortunately, we rarely know the exact phrase or keyword the customer typed into that search box to start their journey, but we know the page they landed on when they reached our website.
Based on data from Google Search Console, we know what keywords and phrases a page is ranking for, and we can therefore make a fairly educated guess as to what keywords the customer used to arrive on the page. We can then use this data to build an inventory of keywords and phrases, each of which may dictate a completely different user intent.
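One way to build that inventory is to group a Google Search Console queries export by landing page. The sketch below is a minimal, illustrative example; the column names and sample rows are assumptions standing in for a real export, which includes fields like Query, Landing Page, Clicks, Impressions, and Position.

```python
import pandas as pd

# Sample rows standing in for a Google Search Console "Queries" export.
gsc = pd.DataFrame({
    "query": ["red running shoes", "running shoe sizing", "best trail shoes"],
    "page": ["/shoes/running", "/shoes/running", "/shoes/trail"],
    "impressions": [1200, 800, 650],
})

# Build a per-page keyword inventory: every query a page ranks for,
# ordered by impressions so the highest-demand intents surface first.
inventory = (
    gsc.sort_values("impressions", ascending=False)
       .groupby("page")["query"]
       .apply(list)
       .to_dict()
)
print(inventory["/shoes/running"])
```

Each list then becomes the set of intents to map against the customer journey for that page.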
The page keywords and phrases serve as our inventory of the customer intents, and this allows us to perform an audit to identify the gaps in our experience. What objectives may the customer have that our website experience fails to help them achieve? We map the various intents to each stage of the customer journey, then perform a gap analysis. The gaps define the work that needs to be done to optimize our website content for the entire journey.
If you are a retailer, here are some common gaps we see:
The gap categories vary by site purpose and the defined intent. Each page and set of intents requires a dedicated content plan. Gina Brock, our long-time content manager on the SEO team, consistently works through the barriers of content misconception. Her definition of what we need from our content creators helps redefine the SEO content stereotype: “When the search engines perceive the content as offering a value to the consumer, we’re naturally rewarded. What’s good for the consumer is good for SEO.”
Remember, our job is to optimize content for varying customer intents — not for search engines.
The next time you find yourself in a discussion about “SEO content,” put together a use case diagram outlining the potential intents. That will illustrate that SEO is simply utilizing the information consumers provide Google to determine their content needs and that quality is defined by how well the content serves the customer. You will be pleasantly surprised at how well future content requests are received.
The post Developing content for the customer journey appeared first on Search Engine Land.
Let’s face it — if you have worked in the industry for a while, you are aware that a stigma has existed around SEO for years. In addition to putting your site at risk for a manual penalty, questionable SEO practices can tarnish a brand’s reputation. Those of us who have properly applied SEO principles and committed to protecting our brands have gotten a bad rap due to others who have misapplied SEO for their clients or companies.
Running an enterprise SEO program for an established brand requires that one acknowledge the stigma and place a focus on changing perception. Changing perception requires action, not words. Simply educating the company on the value of SEO, or how SEO can be applied responsibly, is not enough. Strategy alignment, allocation of assigned resources, and a full demonstration of defending and enhancing the brand is critical.
Iconic brands stand the test of time by placing brand ahead of everything else — so when working with an established brand, one effective way to obtain support for an SEO program is by focusing on brand protection rather than potential traffic and revenue opportunity.
Explaining the role of SEO in brand protection is a compelling argument for establishing and investing in an SEO program. Position your program as a defender of the brand first and a revenue stream second. Brand recovery opportunities come by at a much lower frequency than revenue opportunities, and a single misplacement of priority can be devastating.
My colleague John Curtis, who runs the day-to-day SEO operations for our team at Walgreens, made an excellent point in a recent discussion about brand protection: If you relinquish your brand story to others, you lose control of the brand and the message.
Customers seek information about your brand, your products and your services. If you fail to provide the resources or answer the questions, someone else will become the source of information about your company. Thus, it’s critical for web properties that you control and manage to appear in search engine results for queries related to your brand. External sources are typically not brand-exclusive, nor do they adhere to the company’s brand guidelines.
When brand term visibility is not a priority for the company, you run the risk of customers being exposed to incorrect information or a misrepresentation of the brand. Allowing external sources to become the source of information about your brand leads to lost revenue and potentially negative brand experiences.
Companies with brick-and-mortar operations and/or mobile apps have far more SEO touch points where the brand must be protected. Our resident local SEO expert, Kyle Eggleston, explained the impact of local SEO on the brand in a single sentence: “People blame your brand, not Google, when wrong information is displayed in the local search results.”
Eggleston further noted that there is a significant amount of information provided around local search queries that details exactly what customers are looking for in your store, and you can leverage that by making that information more easily accessible.
The next time you find yourself presenting an SEO strategy where you are seeking support from your organization, be sure to include the goals around brand protection in your primary points of discussion. Below is a list to reference that outlines why SEO is arguably the most critical program in terms of protecting the brand online:
Regardless of your role in the company, be accountable to the brand. If your role is in SEO, reach out to brand managers or marketing executives in your company and solidify your understanding of the brand you are defending. The more the organization is exposed to the role SEO plays in brand protection and brand enhancement, the faster the stigma of “bad for the brand” will disappear.
The post SEO: The missing piece in brand protection appeared first on Search Engine Land.
A good friend of mine and truly the best SEO expert I have had the privilege of working with, Gregory Gromov, once referred to the Google algorithm updates and tests as “Google storms.” The coined phrase made all the sense in the world. Per Gregory, a solid SEO program provides the ballast to weather the storm, but if a storm hits and flips you over… well, it is time to right the ship.
A Google algorithm update is actually a rare opportunity. While in some cases it may appear to be more of a nightmare than a dream come true, understanding how to capitalize on the event is key to succeeding in SEO — and as your program matures, you will look forward to the updates.
The following is a process I have used for years to evaluate Google updates at a site level to glean new opportunities for improvement and to determine what is already working. This is a second-level analysis after I have reviewed patterns related to page title, meta description and H1 tags.
The first step in the analysis is establishing the timeline of the update. Search Engine Land usually publishes information quickly when an update is confirmed, so it should be pretty easy to get an approximate date range. Once you have a date range, begin in Google Search Console (GSC), running comparisons within the date range on your top pages to determine the date of impact.
After establishing the approximate start date of impact, grab the two weeks prior to the date and the two weeks from the date forward. Export the data to Excel, and add a column calculating the change in position between the two periods. I like to look at positive change first; so, after creating the calculated change column, filter for values greater than 0.
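If you prefer to script this step instead of working in Excel, the same comparison can be sketched in pandas. The data frames below are illustrative stand-ins for the two GSC top-pages exports; column names are assumptions, not GSC's exact export headers.

```python
import pandas as pd

# Stand-ins for two GSC "top pages" exports: the two weeks before
# the update and the two weeks after. Values are illustrative.
before = pd.DataFrame({"page": ["/a", "/b", "/c"], "position": [4.2, 7.1, 3.0]})
after = pd.DataFrame({"page": ["/a", "/b", "/c"], "position": [3.5, 7.9, 3.0]})

merged = before.merge(after, on="page", suffixes=("_before", "_after"))
# Positive change = the page moved UP (a lower position number is better).
merged["change"] = merged["position_before"] - merged["position_after"]
improved = merged[merged["change"] > 0]
print(improved["page"].tolist())
```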
Note: The data represented is sample data compiled for demonstration purposes; the data is not from a live domain.
As we continue the process, we will be pulling data from multiple sources and then combining the data sources to form a full view of all the critical SEO data points. Before we move to the next step, though, we should review the GSC data by itself to see if a pattern exists.
Identifying patterns requires pages that have similar metrics. Add a filter for the range of change that is outside of your normal variation. For example, if your website average position for your high value pages moves up or down between 0.1 and 0.3, as is typical for page 1 rankings, then add a filter to your change column that exceeds the normal variation. For the purposes of my demonstration, we are going to look at ranges between 0.5 and 0.99. A half-position change for a page one position is significant in a two-week time frame.
Now that we have a set of data to analyze, we want to grab a sample of pages that have similar metrics. Pages with similar position change, average position and click-through rates (CTR) make a good sample.
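Continuing the pandas sketch above, the band filter and the similar-metrics sample look like this. The comparison sheet and its thresholds are illustrative; tune the ranges to your own site's normal variation.

```python
import pandas as pd

# Stand-in for the GSC comparison sheet with a computed position-change
# column plus average position and CTR. Values are illustrative.
merged = pd.DataFrame({
    "page": ["/a", "/b", "/c", "/d"],
    "change": [0.7, 0.12, 0.55, 1.8],
    "avg_position": [3.5, 2.1, 4.0, 9.2],
    "ctr": [0.041, 0.12, 0.038, 0.009],
})

# Keep only changes outside normal variation: here, between 0.5 and 0.99.
band = merged[merged["change"].between(0.5, 0.99)]

# From that band, sample pages with similar metrics
# (close average position and close CTR).
sample = band[band["avg_position"].between(3, 5) & band["ctr"].between(0.03, 0.05)]
print(sample["page"].tolist())
```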
Reviewing the GSC data alone in this demonstration did not produce results. Position changes in the positive range in our data set do not have common data elements as a whole. The CTR variation is significant, which tells us the positive fluctuation was not the result of a weight change in the algorithm related to CTR… so on to the next step we go.
The next step is to use the crawler of your choice. I prefer Screaming Frog due to the speed of gathering the data. Copy and paste into a text document the list of your URLs that are in the range of change you are analyzing, then perform a crawl on the URLs using the list option in Screaming Frog.
For this pass, we are focused on the data elements from the crawl rather than the meta data elements like description, title and so on. The idea here is simply to look for data similarities that stand out. Rarely will you find the exact data point you are looking for to correlate to the position change during this step. Typically, it is through the entire process that we find multiple data points that correlate.
In this example, I found a couple of correlations related to page size and response time. I know this crawl is isolated only to the group of URLs that improved in the range of a half-position, so these correlations are important to set to the side. Now I am at the point where I want to see the full picture of these URLs.
Completing the picture requires exporting engagement metric data from your analytics package. Google Analytics provides this view in the Landing Pages report under Site Content. If you are using Google Analytics, export the data as well from the Site Speed -> Page Timings report. For this data we can use just the two weeks from the date of change.
Obtaining the full view of the data requires consolidating the data sources. The issue with data consolidation is that Google does not utilize the same format for URLs across Google Analytics and Google Search Console. Luckily, there is a free tool to standardize the URLs. Simply copy and paste the URLs from GSC and Screaming Frog into this tool, and it will strip the URLs down to the root page:
Copy and paste the stripped URLs into the Excel spreadsheet for GSC and Screaming Frog.
If you are working with smaller data sets and/or just prefer to work directly in Excel, you can use the following formula with some minor adjustments to strip the full URL down to the page:
=RIGHT(A3,LEN(A3)-FIND("/",A3,FIND("//",A3)+2))
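If you are scripting the consolidation instead, a Python equivalent of that formula might look like the sketch below. Note one deliberate difference: it keeps the leading slash, which matches the landing-page format Google Analytics uses, whereas the Excel formula drops it.

```python
from urllib.parse import urlparse

# Reduce a full URL to its path so GSC, Screaming Frog and Google
# Analytics rows can all be keyed on the same value.
def strip_url(url: str) -> str:
    path = urlparse(url).path or "/"
    # Drop a trailing slash so "/page/" and "/page" match,
    # but keep the bare root as "/".
    return path.rstrip("/") or "/"

print(strip_url("https://www.example.com/shoes/running/"))  # -> /shoes/running
```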
Combining the data requires linking spreadsheets and performing VLOOKUPs — or, as I prefer, using a database tool. With larger data sets, you can use Access and combine the data quickly. Using the URLs as the joining column, you can produce a spreadsheet with all the core data elements you need to find correlations.
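The same join can be done in pandas, where one merge per source replaces a chain of VLOOKUPs. The three data frames below are illustrative stand-ins for the cleaned GSC, Screaming Frog, and analytics exports, each already keyed on the stripped URL.

```python
import pandas as pd

# Stand-ins for the three cleaned exports, keyed on the stripped URL.
gsc = pd.DataFrame({"page": ["/a", "/b"], "change": [0.7, 0.55]})
crawl = pd.DataFrame({"page": ["/a", "/b"], "response_ms": [310, 890]})
analytics = pd.DataFrame({"page": ["/a", "/b"], "bounce_rate": [0.28, 0.47]})

# Inner joins keep only URLs present in every export, so every row
# in the result has a full set of SEO data points.
combined = gsc.merge(crawl, on="page").merge(analytics, on="page")
print(combined.columns.tolist())
```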
Beyond evaluating the data, URLs with similar data points should be evaluated at a page level. For example, is the content ratio on the page similar, and if customer reviews or Q&A are present, do they appear on the similar pages in similar volume?
Document all of the metrics that correlate so we can validate the correlations later. In the sample data above there is an important correlation we will want to investigate.
The conclusion we can draw here is that the pages that received a significant positive impact all had two correlating data points. Page load time was below four seconds and bounce rate was below 33 percent.
The last step in this analysis project is to filter the data for negative change and evaluate the pages with significant declines in position, validating the theory that their page load times and bounce rates exceeded the values from the positive-change data set. If the data demonstrates the negative metrics are on the opposite end of the spectrum, you know exactly what you need to do to bring the negative positions up and to improve the pages that gained after the algorithm update even further.
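That validation step can be sketched as a simple threshold check. The declining-page data below is illustrative, and the four-second and 33 percent thresholds are the correlations found in this article's sample data, not universal cutoffs.

```python
import pandas as pd

# Illustrative combined data for pages that DECLINED after the update.
declined = pd.DataFrame({
    "page": ["/x", "/y", "/z"],
    "load_time_s": [6.2, 3.1, 7.8],
    "bounce_rate": [0.51, 0.30, 0.62],
})

# Theory from the positive set: winners loaded in under 4 seconds with
# bounce rates under 33 percent. Count the losers that break both rules.
breaks_both = declined[(declined["load_time_s"] >= 4) & (declined["bounce_rate"] >= 0.33)]
print(len(breaks_both), "of", len(declined), "declining pages exceed both thresholds")
```

If most of the declining pages break both rules, the theory holds, and the fix list writes itself.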
Quality updates are a goldmine for SEO improvement. Next time a verified update rolls out, run the analysis and find the hidden gems.
The post Weathering the Google storms appeared first on Search Engine Land.