
02.12.2023

Every website relies on Google to some extent. It’s simple: your pages get indexed by Google, which makes it possible for people to find you. That’s the way things should go.

However, that’s not always the case. Many pages never get indexed by Google. If you work with a website, especially a large one, you’ve probably noticed that not every page on your website gets indexed, and many pages wait for weeks before Google picks them up.

Various factors contribute to this issue, and many of them are the same factors that are mentioned with regard to ranking — content quality and links are two examples. Sometimes, these factors are also very complex and technical. Modern websites that rely heavily on new web technologies have notoriously suffered from indexing issues in the past, and some still do.

Many SEOs still believe that it’s the very technical things that prevent Google from indexing content, but this is a myth. While it’s true that Google might not index your pages if you don’t send consistent technical signals as to which pages you want indexed, or if you have insufficient crawl budget, it’s just as important that you’re consistent with the quality of your content.

Most websites, big or small, have lots of content that should be indexed — but isn’t. And while things like JavaScript do make indexing more complicated, your website can suffer from serious indexing issues even if it’s written in pure HTML. In this post, let’s address some of the most common issues and how to mitigate them.

Reasons why Google isn’t indexing your pages

Using a custom indexing checker tool, I checked a large sample of the most popular e-commerce stores in the US for indexing issues. I discovered that, on average, 15% of their indexable product pages cannot be found on Google. That result was extremely surprising. What I needed to know next was “why”: what are the most common reasons why Google decides not to index something that should technically be indexed?
Google Search Console reports several statuses for unindexed pages, like “Crawled — currently not indexed” or “Discovered — currently not indexed”. While this information doesn’t explicitly help address the issue, it’s a good place to start diagnostics.

Top indexing issues

Based on a large sample of websites I collected, the most popular indexing issues reported by Google Search Console are:

1. “Crawled — currently not indexed”

In this case, Google visited a page but didn’t index it. Based on my experience, this is usually a content quality issue. Given the e-commerce boom that’s currently happening, we can expect Google to get pickier when it comes to quality. So if you notice your pages are “Crawled — currently not indexed”, make sure the content on those pages is uniquely valuable:

- Use unique titles, descriptions, and copy on all indexable pages.
- Avoid copying product descriptions from external sources.
- Use canonical tags to consolidate duplicate content.
- Block Google from crawling or indexing low-quality sections of your website by using the robots.txt file or the noindex tag.

If you are interested in the topic, I recommend reading Chris Long’s Crawled — Currently Not Indexed: A Coverage Status Guide.

2. “Discovered — currently not indexed”

This is my favorite issue to work with, because it can encompass everything from crawling issues to insufficient content quality. It’s a massive problem, particularly in the case of large e-commerce stores, and I’ve seen this apply to tens of millions of URLs on a single website.

Google may report that e-commerce product pages are “Discovered — currently not indexed” because of:

- A crawl budget issue: there may be too many URLs in the crawling queue, and these may be crawled and indexed later.
- A quality issue: Google may think that some pages on that domain aren’t worth crawling, and decide not to visit them by looking for a pattern in their URL.

Dealing with this problem takes some expertise.
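As a minimal sketch of the blocking and consolidation tactics mentioned above (all paths and URLs here are hypothetical examples, not recommendations for any particular site), the relevant directives look like this. One caveat: a page blocked in robots.txt cannot have its noindex tag read by Google, so use one mechanism per URL:

```text
# robots.txt — stop Googlebot from crawling low-value sections
User-agent: *
Disallow: /internal-search/
Disallow: /category/*?filter=

On-page alternatives, placed in the <head> of the HTML:

  keep a crawlable low-quality page out of the index:
    <meta name="robots" content="noindex">

  consolidate a duplicate page to its preferred URL:
    <link rel="canonical" href="https://example.com/products/blue-widget/">
```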
If you find out that your pages are “Discovered — currently not indexed”, do the following:

- Identify if there are patterns of pages falling into this category. Maybe the problem is related to a specific category of products, and the whole category isn’t linked internally? Or maybe a huge portion of product pages are waiting in the queue to get indexed?
- Optimize your crawl budget. Focus on spotting low-quality pages that Google spends a lot of time crawling. The usual suspects include filtered category pages and internal search pages — these pages can easily go into the tens of millions on a typical e-commerce site. If Googlebot can freely crawl them, it may not have the resources to get the valuable stuff on your website indexed.

During the webinar “Rendering SEO”, Martin Splitt of Google gave us a few hints on fixing the “Discovered — currently not indexed” issue. Check it out if you want to learn more.

3. “Duplicate content”

This issue is extensively covered by the Moz SEO Learning Center. I just want to point out here that duplicate content may be caused by various reasons, such as:

- Language variations (e.g. English-language content targeted at the UK, US, or Canada). If you have several versions of the same page targeted at different countries, some of these pages may end up unindexed.
- Duplicate content used by your competitors. This often occurs in the e-commerce industry when several websites use the same product description provided by the manufacturer.

Besides using rel=canonical, 301 redirects, or creating unique content, I would focus on providing unique value for the users. Fast-growing-trees.com is a good example. Instead of boring descriptions and generic tips on planting and watering, the site provides a detailed FAQ for many products and lets you easily compare similar products. Also, every customer can ask a detailed question about a plant and get an answer from the community.
How to check your website’s index coverage

You can easily check how many pages of your website aren’t indexed by opening the Index Coverage report in Google Search Console. The first thing you should look at here is the number of excluded pages. Then try to find a pattern — what types of pages don’t get indexed?

If you own an e-commerce store, you’ll most probably see unindexed product pages. While this should always be a warning sign, you can’t expect to have all of your product pages indexed, especially with a large website. For instance, a large e-commerce store is bound to have duplicate pages and expired or out-of-stock products. These pages may lack the quality that would put them at the front of Google’s indexing queue (and that’s if Google decides to crawl these pages in the first place). In addition, large e-commerce websites tend to have issues with crawl budget. I’ve seen cases of e-commerce stores having more than a million products while 90% of them were classified as “Discovered — currently not indexed”. But if you see that important pages are being excluded from Google’s index, you should be deeply concerned.

How to increase the probability Google will index your pages

Every website is different and may suffer from different indexing issues. However, here are some of the best practices that should help your pages get indexed:

1. Avoid the “Soft 404” signals

Make sure your pages don’t contain anything that may falsely indicate a soft 404 status. This includes anything from using “Not found” or “Not available” in the copy to having the number “404” in the URL.

2. Use internal linking

Internal linking is one of the key signals for Google that a given page is an important part of the website and deserves to be indexed. Leave no orphan pages in your website’s structure, and remember to include all indexable pages in your sitemaps.

3. Implement a sound crawling strategy

Don’t let Google crawl cruft on your website.
If too many resources are spent crawling the less valuable parts of your domain, it might take too long for Google to get to the good stuff. Server log analysis can give you the full picture of what Googlebot crawls and how to optimize it.

4. Eliminate low-quality and duplicate content

Every large website eventually ends up with some pages that shouldn’t be indexed. Make sure that these pages don’t find their way into your sitemaps, and use the noindex tag and the robots.txt file when appropriate. If you let Google spend too much time in the worst parts of your site, it might underestimate the overall quality of your domain.

5. Send consistent SEO signals

One common example of sending inconsistent SEO signals to Google is altering canonical tags with JavaScript. As Martin Splitt of Google mentioned during JavaScript SEO Office Hours, you can never be sure what Google will do if you have one canonical tag in the source HTML and a different one after rendering JavaScript.

The web is getting too big

In the past couple of years, Google has made giant leaps in processing JavaScript, making the job of SEOs easier. These days, it’s less common to see JavaScript-powered websites that aren’t indexed because of the specific tech stack they’re using. But can we expect the same to happen with the indexing issues that aren’t related to JavaScript? I don’t think so.

The internet is constantly growing. Every day new websites appear, and existing websites grow. Can Google deal with this challenge? This question appears every once in a while. I like quoting Google here: “Google has a finite number of resources, so when faced with the nearly infinite quantity of content that’s available online, Googlebot is only able to find and crawl a percentage of that content. Then, of the content we’ve crawled, we’re only able to index a portion.”

To put it differently, Google is able to visit just a portion of all pages on the web and index an even smaller portion.
And even if your website is amazing, you should keep that in mind. Google probably won’t visit every page of your website, even if it’s relatively small. Your job is to make sure that Google can discover and index pages that are essential for your business.
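The server log analysis mentioned under point 3 above can be sketched with a short script. This is a minimal illustration, assuming a combined-format access log and hypothetical sample lines; real Googlebot verification should also include a reverse DNS check, since user agent strings can be spoofed:

```python
import re
from collections import Counter

# Matches the request path and the user agent in a combined-format log line.
LINE_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*".*?"(?P<ua>[^"]*)"$')

def googlebot_crawl_counts(log_lines):
    """Count Googlebot requests per top-level section of the site."""
    counts = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("ua"):
            # Bucket by first path segment, e.g. /search/?q=x -> /search
            section = "/" + m.group("path").lstrip("/").split("/", 1)[0].split("?")[0]
            counts[section] += 1
    return counts

# Hypothetical sample lines in Apache/Nginx combined log format.
sample = [
    '66.249.66.1 - - [01/Dec/2023:10:00:00 +0000] "GET /search/?q=shoes HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Dec/2023:10:00:01 +0000] "GET /products/blue-widget HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [01/Dec/2023:10:00:02 +0000] "GET /products/blue-widget HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(googlebot_crawl_counts(sample))
```

If filtered category or internal search sections dominate the counts, that is a hint Googlebot’s budget is being spent on cruft rather than on the pages you want indexed.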

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

It’s been a long time since Moz last published an in-house ranking factor study, and also a long time since I last published one prior to joining Moz. In my case, this is partly due to my long-standing skepticism and caution around how studies like these are typically very loudly misinterpreted or misrepresented. There’s also the complexity and difficulty of quantifying on-page factors within Google’s increasingly nuanced and sophisticated interpretation of relevance (although, yes, we’re working on it!).

Nonetheless, I think there’s value in a narrower study (or studies), for a few reasons. Firstly, it can be useful to set a comparison point that we might revisit — perhaps if we notice a change in Google’s algorithm, or if we think a given industry or set of keywords might be untypical. Secondly, we might still wish to compare narrower sets of metrics — such as link- vs. domain-level linking factors, follow vs. nofollow links, or branded search volume vs. Domain Authority — and this, too, requires a baseline. Lastly, there’s some merit in reaffirming what we would expect to be true.

How to interpret a correlation study

It’s a cliché to say that correlation does not imply causation, but one that few seem to remember in this context. I’ve written before at length about interpreting correlations, but if you don’t want to go back and read all that, I think the main thing to check before you go any further is whether you can simultaneously accept all of the following to be true:

- Links are a fundamental part of how Google works
- Links are correlated with rankings
- Building links may not always improve rankings
- Sometimes links are a symptom, rather than a cause, of SEO performance

I’m not asking you to agree with all those statements, just to be open to this kind of interplay when you consider studies like this one and how they affect your worldview.
As it happens, though, whatever you or I may think, most SEOs do still hold that links directly improve rankings, which seems reasonable. But surprisingly, a narrow majority will not say this without qualification: this recent study from Aira shows the commonly-cited caveats of a lack of technical issues, and of some verticals not really benefiting.

What counts as a good correlation?

When looking at large datasets and very complex systems, any one metric having a non-zero correlation is worth paying attention to, but obviously some context is needed, and comparison between metrics can be useful for this. For the sake of this study, it’s probably more useful to compare correlation values between metrics than to get hung up on specific absolute values.

With all that said, then, let’s get into the data.

Methodology

This study is based on the first 20 organic results for every MozCast keyword (10,000 keywords), on both desktop and mobile, from a suburban location in the USA. Spearman’s rank correlation is used, as we’re comparing ranked variables (organic ranking) with logarithmic(ish) metrics like DA, and variables with extreme high-end values (like link counts). Using Spearman’s rank allows us to ask whether the order in which results appear is the one we’d expect based on a given metric, rather than getting bogged down in issues around different SERPs having vastly different distributions of link count or DA.

Page, Subdomain, and Domain-level external links

In this chart, we look at how the number of links to a page’s domain predicts its ranking, compared to the number of links to a subdomain, compared to the page itself. Keen students of SEO theory will be unsurprised to see page-level links being by far the most potent predictor. I’m sure this data will feel vindicating to SEOs and digital PRs who swear by building links directly to product or category pages, and they may have a point.
However, there are a couple of things to keep in mind:

- Often, homepages are the most linked-to page on a site. We shouldn’t be surprised to see homepages rank well in the SERPs where they’re relevant, and that is some of what this data describes.
- You can achieve, from a PageRank perspective, a similar effect to direct page-level link building through the use of internal links. (Depending where your built links are pointing, of course.)

Links vs. Authoritative Links

This is perhaps another chart that more reaffirms what we’d hope than blows anyone’s mind, but yes, Moz’s DA and PA metrics — which look at the overall authority as well as quantity of links to a domain or page — do outperform raw followed link count. That said, I may find this unsurprising, but plenty of brands and agencies out there still set KPIs for link building campaigns based on raw link count, so perhaps this chart will be of particular interest in their case!

Branded Search Volume vs. Domain Authority

This comparison is an old favorite of mine, and illustrates some of the reasons why link-level factors are valued by Google in the first place: they were, originally, a proxy for popularity. Those of you paying attention may actually be surprised that DA outperforms branded search volume here. That does tend to be the case as you get deeper into search results. If we look at the top 10 only, we see lower correlations in general (due to the smaller dataset), but the ordering is a little different.

This is a similar finding to studies I’ve published before, and makes sense when you consider the competitive and data-rich environment on the first page for competitive terms. Does this mean branded search volume is a ranking factor? Not necessarily! And this is the type of conclusion I was seeking to warn you about earlier. Brand very likely is an important part of what Google is trying to measure with links, as ultimately they want to give us results that we trust and want to click on.
Presumably, Google’s engineers are not narrow-minded enough to think that links are the only way they could measure brand, given the wealth of data at their disposal, but whether branded search volume specifically is used is anyone’s guess. What we can see is that it very likely correlates with things that are used — just as DA is not directly used by Google, but correlates very well with things that are. Similar to click-based metrics, there’s a semantic debate to be had here around whether something that Google is optimizing towards in its algorithm — but possibly not directly using as an input — constitutes a ranking factor.

Certainly you should not take away that your best bet is to directly manipulate branded search volume by generating a load of artificial searches. That said, naturally causing people to search for your brand, especially in conjunction with relevant product terms, can only be a good thing. Whatever Google is measuring (whether it be links, search volume, clicks, or anything else) is likely to be improved by the same activities you’d use to naturally raise branded search traffic. Which is, of course, probably why it correlates so well.

Takeaways

No major shocks: “links correlated with rankings, SEO study finds!” But there are some important reminders here:

- Page-level performance is important, however you go about achieving it
- Raw link count isn’t a great metric
- Demand for your brand is at least as good a predictor of rankings as domain strength on the first page

Like I said above, though, please do remember in any incendiary tweets you’re now penning that the relationships behind these correlations can be more complex than meets the eye!
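The Spearman’s rank correlation that the methodology above relies on can be sketched in a few lines of pure Python. The positions and link counts below are invented for illustration (the actual study used Moz’s own dataset), and this simple formula assumes no ties:

```python
def rank(values):
    """Assign ranks 1..n by ascending value (assumes no ties, for clarity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    """Spearman's rho for tie-free data: 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(x)
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical SERP: organic positions 1-10 and each result's page-level link count.
positions = list(range(1, 11))
page_links = [950, 400, 620, 310, 120, 200, 80, 45, 60, 10]

# Negative rho here means better (lower-numbered) positions tend to have more
# links; correlation studies usually report the sign flipped for readability.
rho = spearman(positions, page_links)
print(round(rho, 2))
```

Because only rank order matters, a site with 950 links and one with 9,500 contribute the same information, which is exactly why Spearman suits skewed metrics like link counts.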

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

If you know Moz, you know the Beginner’s Guide to SEO. It’s the resource marketers the world over have used to learn SEO and get a taste for its potential and power. And while we offer a delectable buffet of guides in our content smörgåsbord, there hasn’t been one comprehensive resource to serve as a follow-up for those who’ve mastered the beginner level.

That’s why we’ve developed the Professional’s Guide to SEO: a guide that will help folks take the next step, preparing them with all the baseline knowledge they need to practice SEO in a professional capacity. Over the coming weeks, we’ll be sharing chapters and/or chapter excerpts here on the blog, with the full guide releasing at the end. Yes, we want to whet your appetite, but we’d also love to hear your feedback — if there’s something you know in your heart of hearts we should cover, get at us on Twitter (@moz) and let us know!

First up, we’re sharing a portion of our chapter on advanced SEO strategy. Brought to you by the inimitable Kavi Kardos, Moz alumna and SEO Manager at Automox, this chapter looks at getting started with a next-step strategy, tactics to implement, and resources for leveling up. Bon appétit!

Advanced SEO Strategy

Getting started: SEO priorities & plausibility

In the beginning stages, it’s easy to audit a site and come up with long lists of pie-in-the-sky ideas for content, link building, technical fixes, and so on. Most sites, especially those that have never been handled by an advanced SEO, need a lot of work, and the new strategist arriving on the scene often gets pulled in several directions by various teams seeking their expertise. Prioritization of the tasks you’ll undertake and the tactics you’ll employ is a vital first step in developing an advanced SEO strategy. And it’s important to work on this step thoughtfully — ask questions, be realistic, and involve as many stakeholders as you’re able to meet with.
A misstep in the prioritization stage can throw off your schedule for the whole quarter and cause important tasks to fall through the cracks.

Try a SWOT analysis

It may feel old-fashioned, but the classic SWOT analysis (identifying strengths, weaknesses, opportunities, and threats) is a great way to frame your initial site audit, because it will familiarize you with both the website itself and the competitive landscape in which it lives. As you explore both, jot down your thoughts in a Google Doc that you can return to whenever you discover something new.

- Strengths: What is already working well? What high-value terms does the site already rank for? What high-authority sites already link to you? Does the site already score well in page speed and performance tests or avoid other common technical snafus?
- Weaknesses: What is the site lacking? Is it difficult to navigate? Are its sitemaps and robots.txt file messy? Is the organization lacking insight because it doesn’t make use of basic reporting tools like Google Analytics? Does it have a lackluster content strategy?
- Opportunities: What’s on the horizon that could be capitalized on as part of your strategy? Is there a highly valuable asset that’s already been created and is now begging for distribution? Is a tough competitor lagging behind in a certain content area?
- Threats: What’s on the horizon that could be harmful to your search visibility? Is there an up-and-coming competitor with an obvious wealth of SEO resources? Is there a platform migration looming? Is the site likely to fall victim to the next algorithm update?

Being conscious of the site’s current standings, both in the SERPs and in terms of its overall health, will help you prioritize tactics based on urgency. The most dire threats should usually be addressed first, while minor weaknesses can often be moved to your “nice to have” list.
Assess the organization’s search maturity

Regardless of how urgent the need is or how simple a task seems to you, the difficulty of getting your SEO recommendations implemented will vary from organization to organization. The plausibility of executing your strategy depends largely on the organization’s search maturity, or how fully they understand and integrate SEO at all levels of the business. The concept of search maturity was developed by Heather Physioc of VMLY&R, and her guidance on diagnosing where your organization falls along the maturity spectrum is an absolute must-read at this stage in the strategic planning process. Not only does using this model help you solidify your recommendations; it also makes it more likely that those recommendations will see the light of day, because it allows you to communicate with stakeholders on their level.

How much buy-in can you expect from your department, your direct manager or client contact, and the rest of the larger team all the way up to the C-suite? If SEO has been socialized across the organization and is already a part of the company culture, you can probably expect your recommendations to be met with excitement. If not, you may experience some pushback when asking for necessary resources. At an agency, you’ll be dealing with the confines of existing SEO packages as well as the amount of time you’re expected to spend on each client each month. As an in-house SEO, you may have more autonomy, but must often answer to more stakeholders and navigate more red tape.

How difficult will it be to get recommended changes implemented? If the content team has an existing calendar that tends to be jam-packed, new assets may not get slotted in as quickly as you’d like. If the web devs are slammed, working back-end fixes into their sprint cycle can be challenging.

What resources will be available for SEO? Resources come in many forms, and the scarcest of them tend to be headcount and tools.
Are there writers on staff who are capable of creating best-in-class content? Does the marketing team have dedicated developers, or are the folks with access to the site’s code in a totally separate department? What tool subscriptions already exist, and how much budget is available to add to your tool kit?

Create an impact vs. effort matrix

Once you know which areas of the site need the most help the fastest, it’s time to make a list of recommended tactics and further prioritize that list by likely impact weighed against required effort, based on what you learned in the previous step.

Create a matrix like the one above, perhaps in a meeting with relevant stakeholders. The likely impact of a tactic could be small, medium, or large, and the same scale will apply to the level of effort required to complete it. Plot each planned tactic into its own cell. Your list of tactics for the quarter, the year, or whatever time frame is dictated by your organization can include granular tasks as well as larger-scale projects — just make sure you’ve broken down any bigger ideas into pieces that make sense within the plot.

Taking urgency into account, tackle the tactics that will have the highest impact and require the lowest effort first. You may also want to set in motion some more demanding, high-impact tactics at kickoff if they can be chipped away at simultaneously. Low-impact, high-effort tactics can often be reevaluated.

Want more news about the Professional’s Guide to SEO? Don’t miss any of our future sneak peeks — make sure you’re signed up for Moz Blog email updates!
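The impact-vs-effort prioritization described above can be sketched as a tiny script. The tactics and their scores below are invented examples; in practice, the levels come out of the stakeholder meeting:

```python
# Map qualitative levels to numbers so tactics can be ordered.
LEVELS = {"low": 1, "medium": 2, "high": 3}

tactics = [
    {"name": "Fix broken canonicals", "impact": "high", "effort": "low"},
    {"name": "Site-wide redesign", "impact": "high", "effort": "high"},
    {"name": "Rewrite old blog posts", "impact": "medium", "effort": "medium"},
    {"name": "Tweak footer links", "impact": "low", "effort": "high"},
]

def priority(t):
    # Highest impact first; among equal impact, lowest effort first.
    return (-LEVELS[t["impact"]], LEVELS[t["effort"]])

for t in sorted(tactics, key=priority):
    print(f'{t["name"]}: impact={t["impact"]}, effort={t["effort"]}')
```

The sort puts high-impact, low-effort work at the top and low-impact, high-effort work at the bottom, mirroring how you would read the matrix itself.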


More brands than ever are investing in and producing quality journalism to drive their earned media strategy. They recognize that it’s a valuable channel for building authority while finding and connecting with customers where they consume news. But producing and distributing great content is no easy feat.

At Stacker and our brand-partnership model, Stacker Studio, our team has mastered how to create newsworthy, data-driven stories for our newswire. Since 2017, we’ve placed thousands of stories across the most authoritative news outlets in the country, including MSN, Newsweek, SFGate, and the Chicago Tribune. Certain approaches have yielded a high hit rate (i.e., pickup), and one of our most successful tactics is helping add context to what’s going on in the world. (I mentioned this as a tactic in my Whiteboard Friday, How to Make Newsworthy Content: Part 2.)

Contextualizing topics, statistics, and events serves as a core part of our content ideation process. Today, I’m going to share our strategy so you can create content that has real news value and can resonate with newsroom editors.

Make a list of facts and insights

You likely have a list of general topics relevant to your brand, but these subject areas are often too general as a launching point for productive brainstorming. Starting with “personal finance,” for example, leaves almost too much white space to truly explore and refine story ideas. Instead, it’s better to home in on an upcoming event, data set, or particular news cycle. What is newsworthy and specifically happening that’s aligned with your general audience?

At the time of writing this, Jack Dorsey recently stepped down as CEO of Twitter. That was breaking news, and hardly something a brand would expect to cover. But take the event and try contextualizing it. In general, what’s the average tenure of founders before stepping down? What’s the difference in public market success for founder-led companies?
In regard to Parag Agrawal stepping into the CEO role, what is the percentage of non-white CEOs in American companies? As you can see, contextualizing unlocks promising avenues for creative storyboarding. Here are some questions to guide this process.

Question 1: How does this compare to similar events/statistics?

Comparison is one of the most effective ways to contextualize. It’s hard to know the true impact of a fact when it stands alone, in a vacuum.

Let’s consider hurricane season as an example. There are a ton of stories around current hurricane seasons, whether it’s highlighting the worst hurricanes of all time or getting a sense of a particular hurricane’s scope of destruction or impact on a community. But we decided to compare it another way. What if we asked readers to consider what hurricane seasons were like the year they were born? This approach prompts a personal experience for readers, comparing what hurricane seasons are like now to a more specific “then” — one that feels particularly relevant and relatable.

I’ll talk more about time-based comparisons in the next section, but you can also compare:

- Across industries/topics (How much damage do hurricanes do compared to tidal waves?)
- Across geographic areas (Which part of the ocean is responsible for the most destructive hurricanes? Where has the most damage been done around the world?)
- Across demographics (Which generation is most frightened of hurricanes?)

There are dozens of possibilities, so allow yourself to freely explore all potential angles.

Question 2: What are the implications on a local level?

In some cases, events or topics are discussed online without the details of how they’re impacting individual people or communities. We might know what something means for a general audience, but is there a deeper impact or implication that’s not being explored?
One of the best ways to do this is through localization, which involves taking a national trend and evaluating how it’s reflected in and/or impacts specific areas. Newspapers do this constantly, but brands can do it, too.

For example, there are countless stories about climate change, but taking a localized approach can help make the phenomenon feel “closer to home.” We put together a piece that illustrated significant ways climate change has affected each state (increased flooding in Arkansas, the Colorado River drying up, sea levels rising off South Carolina, etc.). You could take this a step further and look at a particular city or community if you had supporting data or research.

If you serve particular markets, it’s easy to implement this strategy. Orchard, for example, does a great job publishing real estate market trend reports in the areas they serve. But if you’re a national or international brand that doesn’t cater to specific regions, try using data sets that have information for all countries, states, cities, ZIP codes, etc., and present all of it, allowing readers to identify the data points that matter to them. When readers can filter data or interact with your content, they get a more personalized reading experience.

Question 3: What sides of the conversation have we not fully heard yet?

The best way to tap into the missing pieces of a story is to consider how other topics/subject areas interact with that story. I’ll stick with our climate change theme. We did the story above on how climate change has impacted every state, which feels comprehensive, but there’s more to dive into. Outside of just thinking about how climate change is impacting geographic areas, we asked ourselves: how is it affecting different industries? Now we have a more specific angle that’s fascinating — how climate change has impacted the wine industry.
When you have a topic and want to uncover less-explored angles, ask yourself a set of questions similar to the compare/contrast model:

- How does this topic impact different regions? (E.g. What is wine’s cultural role in various countries?)
- How does this topic impact different demographics of people? (E.g. Who profits most from winemaking?)
- How does this topic impact different industries? (E.g. How have wineries/vineyards impacted tourism?)
- How is this topic impacted by these various things? (E.g. How is the flavor of wine impacted by region? Who buys the most wine, and where do they live?)

This should create a good brainstorming foundation to identify interesting hooks that aren’t often explored about a really common topic.

Conclusion

Not only will contextualizing differentiate your story from everything else out there, it will also allow you to re-promote it when a similar event occurs or the topic trends again in the future. Contextualized content is often that perfect blend of timely and evergreen that’s really difficult to achieve otherwise.

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.