Fixing ‘Crawled – Currently Not Indexed’ Pages

Having pages appear as “Crawled – Currently Not Indexed” in Google Search Console can be frustrating for website owners aiming to improve their site’s performance in search results. This status means that while Google is aware of the pages and has crawled them, it has opted not to include them in its search index.

The reasons behind this indexing decision can vary. Sometimes, technical issues block Google from crawling or indexing the pages adequately. Other times, the pages may have quality issues or lack unique value compared to competing pages.

Regardless of the specific reason, website owners want to better understand what is causing the lack of indexing and what steps they can take to get these pages fully indexed. Read on, dear friend, and we will show you how to go from zero to hero in double time.

About Google Search Console

Screenshot of Google Search Console

Google Search Console is a powerful tool offered by Google that helps website owners understand how their site is performing in Google’s search results. It provides insights into how Google views your site, including how often your pages appear in search results, which search queries show your site, and how users end up there.

Essentially, it’s a dashboard that offers a glimpse into your site’s visibility on the web, helping you optimise your content for better search engine performance.

One of the statuses you might encounter in Google Search Console is “Crawled – currently not indexed.” This status indicates that while Google has visited (or “crawled”) your pages, it has chosen not to include them in the search index. As a result, these pages won’t appear in search results. Understanding why certain pages receive this status and how to address the issue is crucial for website owners looking to maximise their site’s visibility and performance in Google’s search results.

In this article, we’ll dive into the reasons behind this particular status and offer actionable strategies to help get your pages indexed and visible to your intended audience.

Guide to Fixing ‘Crawled – Currently Not Indexed’ Pages

This guide provides sound advice and actionable steps that SEOs and website owners can implement across five key stages to reduce ‘Crawled – Currently Not Indexed’ issues:

  • Step 1: Cultivate High-Quality Content
  • Step 2: Master Your Index Coverage Monitoring
  • Step 3: Optimise Website Structure for Indexing
  • Step 4: Manage Duplicate Content Strategically
  • Step 5: Directly Engage with Google’s Indexing Processes

By methodically working through fixes and improvements at each stage, pages currently marked as “Crawled – Currently Not Indexed” can better meet Google’s indexing criteria. The desired outcome is to include these pages in Google’s index to enhance the site’s visibility and search traffic.

Step 1: Cultivate High-Quality Content

The foundation of any effective search engine optimisation strategy is creating high-quality content that resonates with your target audience and satisfies their intent when searching for specific queries. For pages marked as “Crawled – Currently Not Indexed”, examining your existing content quality and enriching it must be the priority.

Google strongly prefers comprehensive, informative, engaging, and authoritative content centred on what people are actually searching for. Its search quality evaluators emphasise E-E-A-T (how well you demonstrate Experience, Expertise, Authoritativeness, and Trustworthiness on the topic) and pay particular attention to YMYL topics (Your Money or Your Life: subjects that can significantly affect a reader’s health, finances, safety, or wellbeing) when analysing content for inclusion in search results.

Here are crucial elements to focus on when cultivating your content quality:

  • Comprehensive Depth: Craft content around target keywords that dive extensively into the topic. Provide ample background context and research to demonstrate your command of the subject matter. Expand on specific sub-topics in greater detail, including niche questions searchers may have.
  • Informative Value: Present information in a format that educates readers on crucial concepts central to your topic. Break down complex subjects into understandable overviews before diving into specifics. Emphasise practical, real-world value with advice, how-tos, explainers, comparisons, and list-based content when applicable.
  • Engaging Format: Structure your content for maximum scannability and engagement. Incorporate visuals like infographics, stats, images and videos for added impact. Use formatting elements appropriately, including headings, sub-headings, bolded font, bulleted lists, tables, and more based on the medium.
  • Authoritative Voice: Establish expertise and credibility through the information you spotlight, sources cited, and overall tone used. Provide unique insider perspectives, industry knowledge, real-world expertise and proprietary research when possible.

Regularly assess and update existing content to defend its relevance and value over time. Outdated or subpar legacy content will diminish your odds of indexing, so continue to refine and improve pages even after initial publication.

Optimising for evergreen, authoritative content while aligning with Google’s specified guidelines gives you the best shot at fixing your ‘Crawled – Currently Not Indexed’ pages. Prioritise building a content portfolio with high-quality pages covering subjects with substantial search volume and user value.

Learn more about High Quality Content

Step 2: Master Your Index Coverage Monitoring

Once you have cultivated higher-quality content, the next step is rigorously monitoring your overall site’s index coverage and any changes in Google’s indexing patterns. For pages marked as “Crawled – Currently Not Indexed”, keeping continuous tabs on their status is crucial for diagnosing issues and demonstrating progress.

Google Search Console should become your indispensable indexing companion for this task. Regularly check in on crucial reporting features, including:

  • Index Coverage Report: See how many of your submitted pages are indexed, indexed with warnings, or excluded from Google’s index. Monitor changes over time.
  • Index Status Report: Breaks down indexed vs excluded pages by specific indexing state. Isolate pages marked as “Crawled – Currently Not Indexed” for further inspection.
  • Enhancement Reports: Surface site-wide technical issues that can become indexing obstacles, such as crawl errors, invalid structured data, or problematic canonical and meta tags.
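
If you prefer to work with the data outside the Search Console interface, you can export the affected URLs and summarise them yourself. Below is a minimal Python sketch; it assumes you have exported the ‘Crawled – Currently Not Indexed’ URLs from the Index Coverage / Page indexing report as a CSV, and the file name and ‘URL’ column header are illustrative, so adjust them to match your actual export. It groups the excluded URLs by top-level path so you can spot which sections of the site are worst affected.

```python
"""Summarise a 'Crawled - currently not indexed' CSV export from
Google Search Console by top-level path, to spot problem sections.

Assumptions (adjust to your export): the file is a plain CSV with a
column named 'URL' listing the affected pages.
"""
import csv
from collections import Counter
from urllib.parse import urlparse

EXPORT_FILE = "crawled_not_indexed.csv"  # hypothetical file name


def first_path_segment(url: str) -> str:
    """Return the first path segment of a URL, e.g. 'blog' for /blog/post-1/."""
    path = urlparse(url).path.strip("/")
    return path.split("/")[0] if path else "(homepage)"


def main() -> None:
    counts = Counter()
    with open(EXPORT_FILE, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            counts[first_path_segment(row["URL"])] += 1

    total = sum(counts.values())
    print(f"{total} URLs currently excluded")
    for section, n in counts.most_common(10):
        print(f"  /{section}: {n}")


if __name__ == "__main__":
    main()
```

Re-running this after each export gives you a simple way to see whether particular site sections are improving or deteriorating over time.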

In addition, pay close attention to Google algorithm updates like core updates and product review updates that can suddenly change indexing behaviours and standards. Review all reporting around the timing of significant updates to detect new exclusion patterns.

Addressing issues proactively and quickly is imperative before they spiral into larger declines in pages indexed. If you find significant chunks of your site getting “Crawled – Currently Not Indexed” after an update, thoroughly audit those pages for quality and technical deficiencies.

Make Google Search Console reporting an integrated habitual part of your workflows, not just a once-in-a-while therapy session. Ongoing monitoring ensures you defend your index coverage gains over the long haul while surfacing specific pages that need attention.

Step 3: Optimise Website Structure for Indexing

Beyond directly enhancing content quality and monitoring index coverage, website owners aiming to get pages currently marked as “Crawled – Currently Not Indexed” fully indexed should also examine the underlying structure and architecture fuelling internal discoverability.

Optimising your site’s infrastructure and linkage works hand in hand with the search indexing process. Facilitating easy navigation for users and crawlers improves pages’ odds of being included in Google’s index and reduces ‘Crawled – Currently Not Indexed’ issues.

Elements to evaluate and improve include:

  • Logical Information Architecture: Organise your site’s content around intuitive main categories, subcategories and topic-specific pages. Ensure pages focused on the same subtopic are logically grouped.
  • Clear Global Navigation Menu: Make primary site sections and essential pages easy to identify in the main navigation across every page. Minimise overreliance on breadcrumbs and footer links.
  • Contextual Interlinking: Link relevant related content together frequently to clarify relationships. But avoid overloading any page with too many links.
  • Prominent Calls-to-Action: Prompt visitors to click through to other helpful content on your site, often through strategically placed buttons, visuals and copy links.
  • Consolidate Thin Pages: Any ultra-specific pages with little unique value should be reworked into more comprehensive parent pages when possible.
  • Limit Depth: Avoid burying pages too far down long chains of nested categories or subdirectories if avoidable. Lengthy paths diminish discoverability.
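
To put a number on “Limit Depth”, you can run a small crawl of your own site and measure how many clicks each page sits from the homepage. The sketch below is illustrative only: it assumes the third-party requests and beautifulsoup4 packages are installed, uses a placeholder start URL, and caps itself at a small page count so it stays polite to your server.

```python
"""Measure click depth from the homepage with a small breadth-first crawl.

A minimal sketch, assuming `requests` and `beautifulsoup4` are installed
and that START_URL is your own site (crawl responsibly and respect robots.txt).
"""
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # hypothetical homepage
MAX_PAGES = 200                          # keep the crawl small


def crawl_depths(start_url: str) -> dict:
    """Return a mapping of internal URL -> number of clicks from the homepage."""
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < MAX_PAGES:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths


if __name__ == "__main__":
    # Flag pages four or more clicks deep; they are harder for crawlers to reach.
    for page, depth in sorted(crawl_depths(START_URL).items(), key=lambda kv: -kv[1]):
        if depth >= 4:
            print(depth, page)
```

Pages that only show up at the bottom of this list are strong candidates for better internal linking or a flatter category structure.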

Revisiting your information architecture, internal linking approach, redundancy management, and general site structure should occur regularly. As your domain grows over time, previously sound structures can become fragmented. Optimising your setup for harmonised discoverability and crawl efficiency amplifies pages’ indexing potential.

Step 4: Manage Duplicate Content Strategically

Another common issue causing “Crawled – Currently Not Indexed” pages is presenting excessive duplicate content – whether verbatim copied content or content that is just overly similar across different pages.

Google explicitly penalises sites that engage in deliberate duplicate content practices to manipulate search results. However, even accidental identical content issues can signal evaluation problems to Google, diminishing pages’ indexing eligibility and increasing your chances of more ‘Crawled – Currently Not Indexed’ issues.

Here is a strategic approach to duplicate content management:

Audit Your Site for Duplicate Content

Thoroughly audit your site using Copyscape or similar tools to identify any rampant duplicate content issues, whether entirely copied or highly similar page versions. Identify the root causes behind duplicates.

Running a thorough duplicate content audit and reworking each affected page so it covers its topic more uniquely will help Google understand your site better, making it easier to rank relevant content.

How to Conduct a Duplicate Content Audit
  1. Use Copyscape or Similar Tools: Enter your website’s URL into Copyscape or any similar plagiarism-checking tool. These tools can scan the web for content that matches yours.
  2. Manual Checks: For more in-depth analysis, manually check your content by copying snippets and searching for them in quotation marks on search engines. This can reveal less obvious cases of duplication.
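
If you want to automate the comparison step across your own pages, a rough sketch like the one below can help surface near-duplicates. It assumes requests and beautifulsoup4 are installed; the URL list and the 85% similarity threshold are purely illustrative.

```python
"""Flag near-duplicate pages on your own site by comparing their visible text.

A rough sketch, assuming `requests` and `beautifulsoup4` are installed;
the URL list and the 0.85 similarity threshold are illustrative.
"""
from difflib import SequenceMatcher
from itertools import combinations

import requests
from bs4 import BeautifulSoup

URLS = [  # hypothetical pages you suspect overlap
    "https://www.example.com/red-widgets/",
    "https://www.example.com/crimson-widgets/",
    "https://www.example.com/widgets/red/",
]


def page_text(url: str) -> str:
    """Fetch a page and return its visible body text, stripped of boilerplate."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return " ".join(soup.get_text(" ").split())


texts = {url: page_text(url) for url in URLS}
for a, b in combinations(URLS, 2):
    ratio = SequenceMatcher(None, texts[a], texts[b]).ratio()
    if ratio >= 0.85:
        print(f"{ratio:.0%} similar: {a} <-> {b}")
```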
Common Sources of Duplicate Content
  • Multiple URL Variations: The same page is accessible through different URLs (e.g., with and without the ‘www’ prefix, HTTP and HTTPS versions, or URL parameters).
  • WWW vs. Non-WWW: Ensure you have set a preferred domain in your site settings.
  • HTTP vs HTTPS: After switching to HTTPS, redirect all HTTP pages to their HTTPS counterparts.
  • Session IDs in URLs: If your site uses session IDs for users, search engines might index these as separate pages.
  • Printer-Friendly Versions of Content: These can be seen as duplicates if you offer printer-friendly versions of your pages.
  • Syndicated Content: Content you’ve syndicated (or has been syndicated from your site) to other sites without a canonical link can appear as duplicate content.
How to Fix Duplicate Content Issues
  • 301 Redirects: Use 301 redirects to guide users and search engines to the correct version of a page.
  • Canonical Tags: Implement rel="canonical" tags to indicate the preferred version of a page to search engines.
  • Meta Robots Noindex: Use a ‘noindex’ tag on pages you don’t want search engines to index.
  • Improve Internal Linking: Ensure consistent internal linking so every link to a content piece points to the same URL.
  • Content Rewriting: For pages that are too similar, consider rewriting content to make each page unique and valuable.
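
Once fixes are in place, it is worth spot-checking them. The following sketch (again assuming requests and beautifulsoup4, with a hypothetical check list) confirms that old URLs issue a 301 and that each page’s rel="canonical" points at the preferred version.

```python
"""Spot-check duplicate-content fixes: old URLs should 301, and each page's
rel="canonical" should point at the preferred URL.

A minimal sketch, assuming `requests` and `beautifulsoup4` are installed;
the check list is illustrative.
"""
import requests
from bs4 import BeautifulSoup

CHECKS = [
    # (url to fetch, expected canonical / redirect target) - hypothetical values
    ("http://example.com/page/", "https://www.example.com/page/"),
    ("https://www.example.com/page/?sessionid=123", "https://www.example.com/page/"),
]

for url, expected in CHECKS:
    # First request without following redirects, so we can see whether a 301 is issued.
    first_hop = requests.get(url, allow_redirects=False, timeout=10)
    if first_hop.is_redirect:
        print(f"{url} -> {first_hop.status_code} {first_hop.headers.get('Location')}")

    # Then fetch the final page and read its canonical tag.
    final = requests.get(url, timeout=10)
    soup = BeautifulSoup(final.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    href = canonical["href"] if canonical else "(missing)"
    status = "OK" if href == expected else "CHECK"
    print(f"{status}: {url} canonical -> {href}")
```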

After identifying duplicate content, addressing these root causes can significantly improve your site’s SEO performance. Remember, the goal is to provide a better user experience and ensure your content effectively satisfies its intended audience.

Address Core Problems

Where possible, tackle the foundational website architecture problems that produce duplicates (separate mobile pages, localised variants, outdated pages migrated from old sites, and so on) by streamlining your infrastructure.

Addressing foundational website architecture problems that lead to duplicate content requires a strategic approach, focusing on streamlining your site’s infrastructure. Here are actionable steps to tackle common issues such as separate mobile pages, localised variants, and outdated migrated pages from old sites:

1. Separate Mobile Pages (m-dot sites)
  • Solution: Responsive Design. Shift from having separate mobile URLs (often marked as m.example.com) to a responsive web design that adapts to any screen size. This eliminates duplicate content issues and consolidates your SEO strength.
  • Use Canonical Tags. If maintaining separate mobile pages is necessary for a time, ensure each mobile page has a rel="canonical" link pointing to the corresponding desktop page.
2. Localised Variants
  • Hreflang Tags. For websites serving content in multiple languages or regions, use hreflang tags. These tags tell search engines about the relationship between web pages in alternate languages, helping to serve the correct regional or language version in search results.
  • Consolidate Similar Content. Evaluate if multiple localised pages can be consolidated into fewer pages without negatively impacting user experience. When not possible, ensure content is sufficiently localised to warrant separate pages.
3. Outdated Migrated Pages from Old Sites
  • 301 Redirects. Implement 301 redirects from old URLs to the most relevant current URLs. This not only helps with duplicate content issues but also preserves link equity.
  • Update Internal Links. Ensure all internal links point to the current domain’s pages, not to outdated URLs or the old domain.
  • Audit and Clean Up. Regularly audit your site for outdated content or pages that have been migrated but not correctly redirected or removed. Tools like Screaming Frog SEO Spider can help identify these issues.
4. Streamlining Site Infrastructure
  • URL Parameter Handling. Use tools like Google Search Console to indicate how URL parameters should be treated (e.g., sorting parameters that don’t change content can be ignored by search engines).
  • Consistent URL Structure. Maintain a logical and consistent URL structure across your site to prevent duplicate content due to URL variations.
  • Simplify Navigation and Content Structure. Ensure your site’s architecture is intuitive and straightforward, minimising the need for duplicate pages. Consolidate pages where possible to enhance the user experience and SEO.
5. Technical SEO Enhancements
  • Sitemaps. Regularly update your XML sitemap and ensure it only includes canonical versions of URLs to help search engines crawl and index your site more effectively.
  • Robots.txt. Use the robots.txt file wisely to prevent search engines from crawling duplicate pages or URL parameters that generate the same content.
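
Tying the last two points together, the standard-library sketch below reads your XML sitemap and checks each listed URL against robots.txt to confirm Googlebot is allowed to crawl it. The sitemap and robots.txt locations are placeholders; swap in your own.

```python
"""Check that the URLs listed in your XML sitemap are crawlable by Googlebot
according to robots.txt (standard-library modules only).

A minimal sketch; the sitemap and robots.txt URLs are illustrative.
"""
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical
ROBOTS_URL = "https://www.example.com/robots.txt"    # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Load and parse robots.txt once.
parser = urllib.robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

# Pull every <loc> entry out of the sitemap.
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.fromstring(resp.read())
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS) if loc.text]

blocked = [u for u in urls if not parser.can_fetch("Googlebot", u)]
print(f"{len(urls)} URLs in sitemap, {len(blocked)} blocked by robots.txt")
for u in blocked:
    print("  blocked:", u)
```

Any URL that appears in the blocked list is sending Google mixed signals: the sitemap invites crawling while robots.txt forbids it, so fix one or the other.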

Addressing these foundational issues can significantly improve your website’s SEO performance and user experience. Remember, the goal is to create a streamlined, efficient site structure that serves users well and is easily understood by search engines.

Consolidate Thin Pages

Extra pages that barely cover their topics still feed quality and indexing signals to Google, and weak ones drag the rest of your site down. Absorb these less valuable pages into authoritative content hubs.

Consolidating thin pages into authoritative content hubs is a strategic move that can significantly boost your site’s SEO performance and user experience. Thin content pages often provide little value on their own and can dilute your site’s overall quality in the eyes of search engines. Here’s how to effectively consolidate them:

1: Identify Thin Content Pages
  • Use SEO Tools: Utilise SEO tools like Screaming Frog, Ahrefs, or SEMrush to audit your site for pages with low word counts, low engagement metrics, or minimal traffic.
  • Manual Review: Beyond automated tools, manually review your content to identify pages that may not provide substantial value or fully answer a user’s query.
2: Plan Your Content Hubs
  • Define Your Hubs: Based on your site’s topics, define a set of authoritative hubs that can encapsulate related thin pages. These hubs should be broad enough to cover various aspects of a topic but focused enough to remain relevant.
  • Map Your Content: Create a content map that aligns thin pages to their relevant hubs. Identify which content can be expanded upon and which should be merged or redirected.
3: Enhance and Consolidate Content
  • Merge and Redirect: Begin merging thin content into your designated hubs. Update the hub content to incorporate critical points from the thin pages, ensuring a seamless integration that adds value.
  • 301 Redirects: After merging content, implement 301 redirects from the original thin pages to the new, consolidated hub page. This helps preserve link equity and guides users and search engines to a more comprehensive resource.
  • Update Internal Links: Ensure all internal links that previously pointed to the thin pages are now directed to the relevant hub page. This step is crucial for maintaining a coherent site structure; a quick link-audit sketch follows this list.
4: Optimise for SEO
  • Keyword Optimisation: Ensure your consolidated hubs are optimised for relevant keywords, incorporating them naturally into the content, meta titles, and descriptions.
  • User Experience: Focus on the user experience by improving readability, adding relevant multimedia elements, and ensuring the page is navigable and engaging.
  • Quality Over Quantity: Always prioritise the quality of your content over sheer quantity. Your hubs should thoroughly cover the topics, providing real value to your readers.
5: Monitor and Adjust
  • Performance Tracking: Use analytics to track the performance of your consolidated content hubs. Look for improvements in rankings, traffic, engagement, and conversions.
  • Continuous Improvement: SEO is an ongoing process. Regularly review your content hubs for opportunities to update, expand, or further consolidate based on performance data and evolving user needs.
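
To support the “Update Internal Links” step above, a quick sketch like the following can scan selected pages for anchors that still point at retired thin-page URLs. It assumes requests and beautifulsoup4 are installed; both URL lists are hypothetical and should be replaced with your own.

```python
"""After consolidating thin pages, find internal links that still point at
the retired URLs so they can be repointed to the new hub pages.

A minimal sketch, assuming `requests` and `beautifulsoup4` are installed;
both URL lists are illustrative.
"""
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

RETIRED_URLS = {  # thin pages that now 301 to a hub - hypothetical
    "https://www.example.com/blue-widget-sizes/",
    "https://www.example.com/blue-widget-colours/",
}
PAGES_TO_SCAN = [  # pages likely to link to them - hypothetical
    "https://www.example.com/widgets/",
    "https://www.example.com/blog/widget-guide/",
]

for page in PAGES_TO_SCAN:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        # Resolve relative links and ignore fragment identifiers.
        target = urljoin(page, a["href"]).split("#")[0]
        if target in RETIRED_URLS:
            print(f"{page} still links to retired page {target}")
```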

Consolidating thin pages into authoritative content hubs streamlines your website and enhances its value to users and search engines. Creating comprehensive, high-quality content resources can improve your site’s relevance, authority, and visibility in search results.

Designate Preferred Versions

Use canonical tags to signal the dominant version to Google for remaining duplicate content like recap/summary pages or separate formats (videos/list posts).

Alter Near-Duplicates

For pages that are similar but not exact copies, selectively delete overlapping elements and replace them with new, distinguishing content so that each page stands alone as a unique resource.

Altering near-duplicate pages to ensure each page stands as a unique entity is critical for enhancing your site’s SEO and providing value to your visitors. Near-duplicates often occur when content themes overlap without providing distinct perspectives or additional value. Here’s a structured approach to differentiate these pages:

Identify Near-Duplicate Content
  • Conduct a Content Audit: Use SEO tools like Screaming Frog, SEMrush, or Ahrefs to identify pages with similar content. Look for pages with closely related titles, meta descriptions, and identical content lengths.
  • Manual Review: Tools can help identify potential duplicates, but a manual review is essential to understand the context and nuances of the content.
Evaluate the Purpose of Each Page
  • Define Unique Value Proposition: For each near-duplicate page, determine its unique value proposition (UVP). What specific aspect of a topic does it address? Is there an audience segment mainly served by this page?
  • User Intent: Understand the different user intents behind each page. Similar topics can cater to different user queries or stages in the customer journey.
Alter Content to Emphasise Unique Elements
  • Delete Redundant Information: Remove any overlapping content that doesn’t contribute to the UVP of each page.
  • Introduce New Content: Add distinguishing content that emphasises the unique angle or value of the page. This can include new insights, updated statistics, case studies, infographics, or user-generated content.
  • Update Internal Links: Ensure that internal links reflect the unique focus of each page, guiding users to the most relevant content based on their journey.
Improve SEO with Targeted Keywords
  • Keyword Research: Perform keyword research to find unique keywords for each page. These should reflect the specific angle or focus of the content.
  • Optimise On-Page Elements: Update titles, headings, meta descriptions, and content to include the new keywords, ensuring each page is optimised for its unique focus.
Use Structured Data to Differentiate Content
  • Implement Structured Data: Use schema markup to provide search engines with more information about the content and context of each page. This can help differentiate similar pages by highlighting specific attributes like articles, product pages, or event listings.
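
As a concrete illustration of schema markup, the sketch below uses only the Python standard library to emit a JSON-LD Article snippet that could be pasted into a page’s head. Every field value is a placeholder; replace them with the page’s real details and validate the output with Google’s Rich Results Test.

```python
"""Emit a JSON-LD Article snippet to embed in a page's <head>, giving search
engines explicit context about the content.

A minimal sketch using only the standard library; all field values are
illustrative and should be replaced with the page's real details.
"""
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Red Widgets vs Crimson Widgets: What Actually Differs",
    "description": "A buyer-focused comparison of two similar widget ranges.",
    "author": {"@type": "Person", "name": "Jane Example"},
    "datePublished": "2024-01-15",
    "mainEntityOfPage": "https://www.example.com/red-vs-crimson-widgets/",
}

# Wrap the JSON in the script tag browsers and crawlers expect.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article_schema, indent=2)
    + "\n</script>"
)
print(snippet)
```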
Monitor and Refine
  • Track Page Performance: Use analytics to monitor the performance of each altered page. Look for improvements in organic traffic, engagement, and conversions.
  • Refine Based on Feedback: Be prepared to refine and adjust content based on user feedback and performance data. SEO is an ongoing process; content must be updated regularly to remain relevant and unique.

Selectively deleting overlapping elements and introducing new, distinguishing content ensures each page provides unique value and stands alone as a distinct entity. This improves your site’s SEO and enhances the user experience by providing more targeted and relevant content.

Reassess Frequently

Continually reassessing your site to scout for new duplicate content issues is a proactive approach to SEO that can save you from significant fixes down the line. Regular monitoring ensures your content remains unique, relevant, and valuable to search engines and users. Here’s how to stay on top of potential duplicate content issues:

1. Schedule Regular Content Audits
  • Set a Routine: Depending on the size of your site and how frequently you update content, set a regular schedule for content audits—monthly, quarterly, or bi-annually.
  • Use SEO Tools: Leverage SEO tools like Screaming Frog, SEMrush, or Ahrefs to automate finding duplicate content across your site. These tools can identify similar page titles, descriptions, and body content.
2. Monitor for Plagiarism
  • Plagiarism Checkers: Utilise tools like Copyscape or Grammarly’s plagiarism checker regularly to ensure your content hasn’t been copied elsewhere on the web and vice versa.
  • Google Alerts: Set up Google Alerts for unique sentences or phrases from your top content to catch instances where your content may be republished without permission.
3. Implement Technical Solutions
  • Canonical Tags: Use canonical tags wisely to point search engines to your site’s original or preferred version of content.
  • Parameter Handling in Google Search Console: Use Google Search Console to inform Google how to handle URL parameters that might generate duplicate content.
4. Keep an Eye on Content Syndication
  • Track Syndicated Content: If you syndicate your content to other sites, ensure they use canonical links back to your site as the source. This helps prevent search engines from viewing syndicated content as duplicates.
  • Syndication Agreements: Regularly review your syndication agreements and practices to ensure they are still in your best SEO interest.
5. Educate Your Content Team
  • SEO Best Practices: Ensure your content creators know SEO best practices regarding duplicate content. This includes understanding how to use canonical tags, the importance of creating unique content, and the right way to repurpose existing content.
  • Content Planning: Encourage your team to maintain a content calendar that helps plan unique and diverse content topics, avoiding unnecessary duplication.
6. Use 301 Redirects for Consolidated Pages
  • Redirect Old URLs: If you consolidate or remove duplicate pages, use 301 redirects to send users and search engines to the most relevant, updated content.
7. Monitor Your Site’s Performance
  • Analytics: Regularly check your site’s analytics for sudden drops in traffic or rankings, which could indicate duplicate content issues or penalties.
  • Webmaster Tools: Utilise Google Search Console and Bing Webmaster Tools to identify crawl errors or messages from search engines that could point to duplicate content problems.

By reassessing your site frequently for duplicate content issues, you can maintain a healthy, SEO-friendly site that ranks well and provides value to your audience. Remember, prevention is always better than cure when it comes to SEO.

Duplicate content requires persistent management. However, methodically cleaning up extensive duplication improves the aggregate indexing signals your pages pass to Google and removes excuses for Google to downgrade or exclude pages over redundancy.

Step 5: Directly Engage with Google’s Indexing Processes

After you’ve cultivated higher quality content, optimised internal site architecture, defended index coverage, and addressed duplication issues, directly interacting with Google’s indexing tools presents another avenue for improving the status of pages marked as “Crawled – Currently Not Indexed.”

Specifically, Google Search Console’s URL Inspection tool can be leveraged to directly bring individual ‘Crawled – Currently Not Indexed’ issues to Google’s attention.

The URL Inspection tool allows you to:

  • Submit Pages for Reindexing: Request pages be freshly re-evaluated for inclusion in search results, especially after significant content improvements.
  • Identify Indexing Blocks: Review specific technical or quality signals hindering indexing for deeper diagnostics.
  • Run Live Tests: View Google’s updated crawling, indexing, and serving evaluations in real time to confirm fixes.
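
If you have many pages to check, the URL Inspection tool also has a programmatic counterpart, the URL Inspection API, which reads back the current coverage state for a URL. The sketch below is a hedged example: it assumes the google-api-python-client and google-auth packages, a service account that has been granted access to the property in Search Console, and that the method and field names shown (urlInspection().index().inspect, indexStatusResult.coverageState) match the current v1 API, so verify them against Google’s documentation before relying on it.

```python
"""Query the Search Console URL Inspection API to confirm a page's current
coverage state after you've improved it.

A hedged sketch: assumes `google-api-python-client` and `google-auth` are
installed, a service-account key with access to the property, and that the
method/field names below match the current v1 API - check Google's docs.
"""
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "sc-domain:example.com"                   # hypothetical property
PAGE = "https://www.example.com/improved-page/"  # hypothetical URL
KEY_FILE = "service-account.json"                # hypothetical key path

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

response = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE, "siteUrl": SITE}
).execute()

status = response["inspectionResult"]["indexStatusResult"]
print("Coverage state:", status.get("coverageState"))
print("Last crawl:", status.get("lastCrawlTime"))
```

Note that the API is read-only and subject to daily quotas, so reserve it for the pages you most need to track; actual reindexing requests still go through the Search Console interface.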

Strategically utilising the inspection tool for pages with pending updates or suspected indexing obstacles can accelerate Google’s recognition of improvements you’ve made on formerly subpar pages.

Prioritise pages for inspection requests aligned to top-funnel content for your highest-value keywords and topics. Pages directly satisfying commercial searcher intent also warrant extra indexing attention when possible.

The URL inspection tool lets you directly spotlight individual page indexing quirks to Google. Proactively highlighting your upgraded pages can help shift them out of languishing as merely crawled and into becoming fully indexed by Google.

Additional ‘Crawled – Currently Not Indexed’ Factors to Consider

  • Internal Linking: Enhance your site’s internal linking structure by linking to orphaned pages from relevant content on your site. This helps signal their relevance and importance to Google.
  • Thin Content: Expand on topics by adding more detailed and valuable information to match or exceed the depth of content found in competing search results.
  • Search Intent: Ensure your content aligns with the search intent of your target queries. If necessary, adjust your content strategy to better match what users are looking for.
  • Duplicate Content: Address near-duplicate content by adding a canonical tag or revising it to make it unique and valuable.
  • Technical Issues: Regularly check for and resolve technical issues that may prevent indexing, such as errors in structured data or issues with 301 redirects.

Conclusion and Final Recommendations

Getting your site’s pages out of Google Search Console’s dreaded “Crawled – Currently Not Indexed” status ultimately requires a multi-pronged strategy addressed through different stages:

  • Stage 1) Revitalise Content Quality
  • Stage 2) Master Index Monitoring Vigilance
  • Stage 3) Optimise Website Architecture
  • Stage 4) Clean Up Duplicate Content
  • Stage 5) Directly Push for Reindexing

Working through focused improvements at each stage gives formerly underperforming pages a significant opportunity to better comply with Google’s indexing standards.

The specific reasons for pages being “Crawled – Currently Not Indexed” depend on the individual page’s circumstances. However, in aggregate, these universal stages capture the most likely obstacles and solutions.

Measure your success through consistent monitoring for actual changes in individual page indexing status over time. Patience is required as shifting pages from only crawled to fully indexed relies on Google’s reevaluation cadence.

Leverage Google Search Console data to track adjustments in your site’s overall index coverage percentage and number of pages escalated out of the “Crawled – Currently Not Indexed” designation.

Celebrate incremental indexing gains as validation your comprehensive strategy is working. Let ongoing results guide future stages needing reinforcement.

Committing to sustainable SEO excellence, summarised in these five stages, gives pages their best shot at going from unseen to standing out in Google’s competitive rankings. Work consistently through each stage and you can graduate your pages from ‘Crawled – Currently Not Indexed’ to fully indexed.