Technical SEO Best Practices

50 tactical tips to improve your rankings

Hello to every unicorn in the galaxy.

Today’s topic is boring but important:

50 technical SEO best practices

If you want SEO traffic, you need to maintain a healthy website that’s optimized for organic traffic.

We’re going to cover the following 5 topics:

  1. Facilitate proper crawling

  2. Provide a clean site structure

  3. Optimize the user experience

  4. Consolidate page authority

  5. Monitor & maintain site health

Each of these will help ensure your site is optimized for maximum exposure in the SERPs.

If you’re into SEO, check out this guide with 35 on-page SEO best practices.

Let’s get started…

1. Facilitating proper crawling

Submit XML sitemaps

  • Create a sitemap index file named sitemap.xml in your root domain folder. This file should list each of your individual XML sitemaps.

  • Create individual XML sitemap files (e.g. sitemap1.xml, sitemap2.xml). They can be named anything as long as they use the .xml file extension (a minimal sketch of both files follows this list).

  • In each sitemap file, list the URLs for a subset of pages on your site, including the last modified date for each URL.

  • Submit the sitemap index file to Google Search Console by going to "Sitemaps" under the "Index" section.

  • You can ping Google to notify them of the new sitemap, or they will automatically discover it within a few days.

  • Submitting sitemaps through Search Console allows efficient crawling of the listed URLs to keep Google's index up to date.

  • Regenerate your sitemaps and resubmit periodically as you add or update site content.
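
Here's a minimal sketch of what the two file types might look like (filenames, URLs, and dates are placeholders):

sitemap.xml (the index file):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap1.xml</loc>
    <lastmod>2023-06-01</lastmod>
  </sitemap>
</sitemapindex>

sitemap1.xml (an individual sitemap):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/green-widgets</loc>
    <lastmod>2023-05-28</lastmod>
  </url>
</urlset>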

Indicate crawl budget in robots.txt

The robots.txt file can be used to manage your site's crawl budget by limiting or allowing access to specific sections or page types.

To indicate crawl budget, you first need to understand which parts of your site are high vs low priority for indexing.

Lower priority pages like confirmation pages, print views, or infrequently updated sections should be restricted to conserve crawl budget.

High priority sections like product or blog content should be allowed full access.

In your robots.txt file, use "Disallow" directives to block crawl access to subsets of lower priority pages, and "Allow" directives to explicitly permit crawling of key sections.

For example:

User-agent: *
Disallow: /checkout/confirm/
Disallow: /print-versions/
Allow: /blog/
Allow: /products/

By thoughtfully indicating crawl budget priorities in your robots.txt, you help focus Google's limited resources on indexing your most important pages instead of wasting crawl budget on unimportant areas of your site.

Monitor crawl stats to optimize directives over time.

Eliminate crawl errors and 404s

Crawl errors and 404 pages indicate issues that frustrate users and waste precious crawl budget. Fixing them should be a priority for technical SEO.

  • Review crawl error reports in Google Search Console to identify any pages returning 404 status codes or other errors like timeouts or redirects.

  • For each error, determine if the page should be reinstated or removed.

  • Reinstate valuable pages by restoring the content or implementing proper redirects. Unimportant outdated pages should redirect to relevant sections of the site or to the home page.

  • Eliminate any broken links directing users to error pages.

  • Monitoring logs and enabling alerts can help detect future crawl errors quickly.

By systematically eliminating crawl errors and 404s, you improve site cohesion and allow more crawl budget to be allocated to indexing quality pages that add value for users. The result is better crawl efficiency and user experience.

Avoid crawling unimportant pages

Not all pages on a website are equally important to be indexed. Crawling unimportant pages wastes crawl budget that could be better spent on priority content.

Examples of low-value pages include order confirmation receipts, print versions of articles, pages with thin content, or sections that are rarely updated.

Use the robots.txt file to block search engines from accessing these less important portions of your site. For example:

Disallow: /checkout/thank-you/
Disallow: /print-versions/
Disallow: /product-category/outdated-models/

You can also use the "noindex" meta tag on individual low-value pages.
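
For example, a hypothetical order confirmation page could carry this in its <head>:

<meta name="robots" content="noindex">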

Alternatively, password protect or IP restrict access to unimportant sections.

By selectively preventing search engines from crawling pages with little value to users, you conserve crawl budget for indexing pages that do deserve prominence in search results.

Continually evaluate site sections to identify new areas that can be crawled less frequently.

Allow crawling of important pages

While some pages should be restricted, it's crucially important to enable full crawling access to priority pages on your site, such as blog posts, help documentation, new products, etc.

Using the "Allow" directive in robots.txt indicates sections search engines should crawl more intensively. For example:

Allow: /blog/
Allow: /help-center/
Allow: /products/new-releases/

This focuses crawl budget on indexing your most important and regularly updated pages.

You can also use the "noindex, follow" meta tag to hide certain pages from search results while still allowing crawling of links.

Allowing access helps search engines efficiently discover new content on important sections as it is added.

As a best practice, enable full crawling of high-value pages to maximize their visibility in search results while protecting less important areas of your site.

Set up proper robots.txt directives

The robots.txt file provides instructions about what parts of your site can be crawled and indexed.

To implement proper robots.txt directives, first determine which pages or sections should be restricted or allowed access.

Lower priority pages like checkouts, print views, or infrequently updated categories are good candidates for restriction.

High priority sections like blogs, help centers, or product categories should be allowed full access.

Use "Disallow" directives to block or restrict crawler access to less important sections:

Disallow: /checkout/
Disallow: /print-views/

Use "Allow" directives to explicitly permit crawling of key areas:

Allow: /blog/
Allow: /help-center/

You can disallow specific filetypes, like blocking image crawling with:

Disallow: /*.jpg$

Test your robots.txt using online validators to catch errors. Keep directives focused on managing crawl budget, not blocking pages from indexing. Monitor search analytics to optimize directives and prioritize crawling of your most important pages.

Minimize use of noindex tag

The noindex meta tag prevents pages from appearing in search results.

Minimizing use of noindex ensures search engines can fully index your important content.

Avoid applying noindex broadly across entire site sections.

Instead, use selectively on specific low-value pages like order confirmations, print versions, or pages with duplicate content.

  • Rely on robots.txt directives to manage crawl budget rather than blocking indexing site-wide with noindex.

  • Evaluate any existing use of noindex tags, removing them from pages that do provide value to users.

  • Only maintain noindex on pages not suitable for search, such as transactional confirmations.

Minimizing noindex improves index coverage of your priority pages. It signals to search engines that these pages are valuable and deserve to rank well in results.

Allowing pages to be indexed enables you to fully leverage them for organic visibility and traffic growth.

Fix broken internal links

Broken internal links frustrate users when clicking on them returns error pages. They also waste crawl budget on pages with no value.

To fix, first run site crawls and check analytics to identify broken links and referrer pages directing users to errors.

Determine if the broken destination page should be reinstated, redirected to a new location, or removed entirely.

  • Update the referring content to point to functioning pages.

  • Eliminating broken internal links enhances user experience by avoiding dead ends and improves information architecture cohesion.

  • It also allows crawl budget to be allocated to indexing quality pages rather than wasted on errors.

By providing working links between related content, internal linking structure becomes more useful for users and supports better crawler navigation of priority site sections.

Maintaining a structure of fixed, functional internal links should be an ongoing technical SEO priority.

Optimize page speed for crawling

Slow page speeds negatively impact crawling efficiency and user experience.

Optimizing page speed makes a site more crawl-friendly so search engines can better index its pages within allocated budget.

  • Best practices include minifying HTML, CSS, and JavaScript files, compressing images, eliminating render-blocking JavaScript, and using a content delivery network to cache assets.

  • Reducing server response times and enabling compression also improves crawl efficiency.

  • Faster page speeds allow bots to crawl more of your important pages within the same crawl budget.

  • Optimized speed also lowers bounce rates for users.

  • Google factors page speed into rankings, so improving performance can directly improve search visibility.

  • Make page speed optimizations part of regular technical SEO audits.

Maintaining fast load times facilitates more effective bot crawling and creates a better user experience on your site.

Control rate of requests to server

If your site gets crawled too aggressively, it can overload your server and cause slowdowns or outages.

Rate limiting allows you to control the speed and volume of crawl requests to prevent disruptions.

In robots.txt, use the "Crawl-delay" directive to specify a delay time in seconds between requests. For example:

Crawl-delay: 10

Crawlers that honor the directive, such as Bingbot, will wait 10 seconds between page fetches. Note that Googlebot ignores Crawl-delay and adjusts its crawl rate automatically based on how your server responds.

You can set different delays for different crawlers by listing them under separate User-agent groups in robots.txt.

For more advanced control, server-side solutions like dynamic throttling adjust crawl rate based on real-time server load. Intelligently controlling crawl request rates protects site performance while still allowing timely indexing of new content.

Monitor crawl stats and server logs to optimize crawl delay values and ensure a smooth crawling experience.

Prevent crawling of duplicate content

Duplicate content on a site can divide page authority and allow lower quality pages to rank in search results.

To prevent wasting crawl budget on duplicative or thin content, use robots.txt directives to block or restrict access to secondary copies.

For example:

Disallow: /print-versions/
Disallow: /product-pages-old/

This focuses crawling on the canonical version and limits indexing multiple copies.

  • Noindex tags can also be selectively applied to additional variant pages when disallowing isn't an option.

  • For necessary duplicates like regional/language versions, use hreflang tags to indicate alternate pages.

  • Preventing duplicate content crawl saves resources for indexing original high-quality pages.

It helps search engines understand which page truly deserves ranking authority for a given piece of content.

Limiting duplicate indexing through smart crawling facilitates optimal rankings.

Enable compression to optimize crawling

Enabling compression for your web pages reduces file sizes and speeds up crawling.

This allows bots to fetch more pages per session within allocated crawl budget.

  • Configure your server for GZIP compression to compress HTML, JavaScript, CSS and other text-based files.

  • Image compression reduces filesize of JPEGs and PNGs without degrading quality.

  • Browser caching directives tell crawlers to store compressed files for faster performance.

  • Compression saves bandwidth and optimizes page loading speed.

  • Faster page loads improve user experience and search engine accessibility.

  • Make sure to test compressed pages to avoid rendering issues.
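
For the GZIP bullet above, here's a rough sketch assuming an nginx server (directive values are illustrative, not a recommendation for every site):

gzip on;
gzip_comp_level 5;
gzip_min_length 256;
gzip_types text/css application/javascript application/json image/svg+xml;

Apache handles the same job with mod_deflate.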

Proper implementation of compression increases crawl efficiency so more of your site can be indexed and available in search results.

Monitor crawl stats and site speed to benchmark compression improvements over time.

Limit access to pages through password protection or IP whitelisting

Sometimes it is necessary to restrict access to certain pages on your site, such as for private content or admin interfaces.

Password protection and IP whitelisting limit access to only authorized users. To password protect pages, use HTTP authentication to prompt a login before the page content is served.

For IP whitelisting, configure server or firewall settings to only allow specified IP addresses to access restricted pages or directories.

Keep in mind that both methods block search engine crawlers as well, since bots cannot log in or request pages from an approved IP address.

That makes them the most reliable way to keep genuinely private content out of the crawl and out of the index, conserving crawl budget for your public pages.

Evaluate whether a restricted page would provide any real value to users in search results.

If it wouldn't, gating it behind a login or IP restriction (or applying a "noindex" tag to publicly reachable low-value pages) keeps it out of results.

If the content should be discoverable organically, leave it publicly accessible rather than placing it behind a password or IP restriction.

Make site navigation easy to crawl

Site architecture and navigation should facilitate crawling by clearly showing the relationships between pages and content.

  • Using a simplified information hierarchy with logically structured URLs makes internal link patterns easy to decipher.

  • Implement contextual breadcrumb trails and site maps to enhance understanding of page focus.

  • Ensure all pages are reachable by following links - orphaned pages are hard to discover.

  • Use descriptive anchor text and strategic linking to highlight connections between related content.

  • Avoid overloading pages with excessive links, and keep hierarchies shallow - overly deep structures tire users and bots.

Optimizing navigation for crawlability leads to better indexation of all pages. It also enhances user experience by making useful content simple to find.

Check crawl stats to identify poorly reached sections for navigation improvements.

Smooth site navigation keeps both users and bots engaged.

Avoid crawling pages with crawling restrictions

Some pages have technical restrictions that prevent proper crawling, like pages behind forms or restrictive robots.txt files. Attempting to crawl these wastes budget on inaccessible content.

  • Identify pages with crawl restrictions through site audits and Search Console errors.

  • Determine if the restrictions can be removed or modified to allow access.

  • For example, implementing server-side rendering or pre-rendering can allow JavaScript-heavy pages to be crawled.

  • If restrictions are unavoidable, use robots.txt or noindex tags to block their indexing.

  • This avoids wasted crawl budget. For pages that should expire from results, the "unavailable_after" directive in a robots meta tag or X-Robots-Tag header tells search engines when to drop them.

  • Monitoring crawl accessibility helps focus efforts on indexable pages.

  • Avoiding restricted content keeps bots from wasting time on pages they cannot fully process or index.

The crawl budget can instead go to better reaching more useful pages.

2. Providing clarity on site structure

Use descriptive URLs and filenames

Descriptive URLs and filenames help search engines understand the topic and purpose of a page without having to process all its content.

For example, "www.example.com/products/green-widget" is more informative than "www.example.com/p1234".

  • URLs with relevant keywords and filenames that use hyphens to separate words are recommended.

  • This also creates a sensible site architecture for users navigating the site.

  • Avoid overly long, duplicated or generic filenames like "page1.html".

  • Check for exact duplicate URLs - only one version should be retained.

  • Implement 301 redirects from old URLs to new versions.

Unique, descriptive URLs and filenames allow search engines to quickly grasp page focus when crawling and determine relevance for a search query.

This results in pages being properly indexed and ranked for applicable user intent.

Implement logical internal linking structure

A logical internal link structure makes relationships between pages clear and facilitates crawler navigation of a website.

  • Links should connect related content, not be randomly placed.

  • Structure sections using hub and spoke models with category pages linking to product/content pages.

  • Link to related content to provide context - like connecting blog posts in the same series.

  • Use links in the body text, not just menus and footers.

  • Ensure all pages are linked to from at least one other page to avoid "orphaning".

  • Establish link paths between top and lower level pages by using breadcrumb trails.

  • Check for broken internal links regularly and repair or remove them.

An intuitive linking structure assists crawlers in understanding the architecture and informs rankings based on page relationships.

It also creates a better user experience allowing them to easily navigate and discover useful information.

Use breadcrumb schema markup

Breadcrumbs provide a trail of links to help users understand the path to the current page within the site hierarchy.

Adding breadcrumb schema markup enhances this for search engines as well.

The JSON-LD schema outlines the crumbs as a list of linked pages from the root URL to the current one. For example:

{"@context": "https://schema.org","@type": "BreadcrumbList","itemListElement":[

{"@type": "ListItem","position": 1,"name": "Books","item": "https://example.com/books"},

{"@type": "ListItem","position": 2,"name": "Science Fiction","item": "https://example.com/books/sciencefiction"},

{"@type": "ListItem","position": 3,"name": "Book Title","item": "https://example.com/books/sciencefiction/booktitle"}]}

Adding breadcrumb schema provides contextual clues about the current page to search engines during crawling and indexing.

This can lead to better categorization in results for applicable searches.

It also enhances site navigation for users.

Optimize use of headings tags

Proper use of heading tags (H1, H2, H3, etc.) is vital for both users and search engines.

  • Headings break up blocks of content, making it easy to scan.

  • They also convey semantic structure, clarifying what sections of content cover.

  • Only one H1 should be used per page - this indicates the main topic focus.

  • Lower level headings further break this down into subtopics.

  • Check that headings adequately reflect page content and hierarchy.

  • Don't "keyword stuff" with phrases that don't match the content.

  • Use headings consistently across site sections.

  • Optimized headings make page content easy to parse for crawlers and users alike.

They facilitate extracting meaning from pages during indexing.

Headings also allow creating an accurate page outline that search engines may display in results.
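
As a simple illustration, a heading hierarchy for an article like this one might look like:

<h1>Technical SEO Best Practices</h1>
  <h2>Facilitate proper crawling</h2>
    <h3>Submit XML sitemaps</h3>
    <h3>Indicate crawl budget in robots.txt</h3>
  <h2>Provide a clean site structure</h2>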

Ensure pages aren't orphaned

Orphaned pages have no links directing users or search engines to them, causing issues with discoverability.

  • To identify orphans, audit your site architecture and check site crawl reports.

  • Add contextually relevant links to orphan pages from related content.

  • If no logical links fit, ensure they are linked in site footers or sitemaps.

  • Pages that truly have no related content may not be useful to optimize.

  • Another option is to 301 redirect orphaned pages to a relevant section of the site.

Eliminating the orphan status facilitates discovery by search engines, leading to improved indexation and rankings possibilities.

It also enhances user experience by connecting the orphan pages into the site's information architecture through contextual internal links.

Maintaining an interconnected site structure should be an ongoing initiative as new content is added over time.

Set preferred domain with canonical tag

When your site content is accessible on multiple domains, the canonical tag indicates the preferred URL to direct link equity towards.

For example, your site is reachable via both example.com and www.example.com.

Adding <link rel="canonical" href="https://example.com"> tells search engines to treat example.com as the authoritative version.

This consolidates page authority to the preferred domain, avoiding dilution across versions.

  • Using the canonical URL tag on every page prevents duplicate content issues from splitting ranking signals across domain versions.

  • Point global canonicals to the domain you wish to focus optimization efforts on.

  • Consistently signaling the authoritative domain ensures both search engine indexes and users end up on the intended authoritative pages.

It also reduces the potential for issues from duplicate content across domains.

Avoid duplicate titles and meta descriptions

Having identical or overly similar titles and meta descriptions across multiple pages dilutes the clarity of purpose for each page in search engine results.

  • To avoid duplication, ensure title tags and meta descriptions are customized to accurately reflect the topic and unique value of each document.

  • Check for reuse of verbatim phrasing across pages.

  • Use title tag modifiers like site name, category, location, or other descriptors to differentiate pages covering the same general subjects.

  • Similarly, meta descriptions should use alternative wording and focus on the distinct aspects rather than copying generic descriptions.

Unique, meaningful page titles and descriptions help search engines understand the specific content and improve click-through rates by enabling users to identify the most relevant result.

Distinct meta data is a vital technical optimization.

Indicate geospecific pages with hreflang

Hreflang tags tell search engines about alternative language or regional versions of a page.

Adding hreflang avoids duplicate content issues by specifying which page targets which location/language. For example:

<link rel="alternate" hreflang="en-us" href="https://example.com/page">

<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page">

This indicates the same content optimized for the US and UK audiences.

Using hreflang where applicable ensures pages surface for the right users based on geo-targeting and languages.

It prevents duplicate content penalties by signaling there are intentional alternative versions of the page for different regions rather than full duplicates.

Properly tagging and targeting geospecific content improves global indexation and traffic.

Implement pagination properly

Pagination divides long lists of content across multiple pages. To implement it correctly:

  • Use descriptive URLs like /blog/page1 rather than just numbers.

  • Link page numbers sequentially and include previous/next links.

  • Indicate on the first page that it continues on subsequent ones.

  • Use rel="next" and rel="prev" attributes to signify page relationships.

  • Limit the number of pages that can be crawled with robots.txt or noindex.
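
For example, the <head> of page 2 in a series might include these link elements (URLs hypothetical):

<link rel="prev" href="https://example.com/blog/page1">
<link rel="next" href="https://example.com/blog/page3">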

Proper pagination makes it easy for crawlers to understand sequencing and crawl additional pages.

It also enables users to navigate seamlessly through all the content.

Poor pagination structure like too many pages or unclear relationships hinders indexing.

Optimized pagination expands the amount of content search engines can include for a particular subject.

It provides a better experience for users as well. Pagination should be built with both bots and visitors in mind.

Use descriptive anchor text for internal links

Anchor text used for internal links should provide context and describe the content being linked to.

For example, link the text "customer support forum" to your forums page rather than just saying "click here".
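
In HTML terms, the difference looks like this (URL hypothetical):

<a href="https://example.com/support/forum">customer support forum</a>
<!-- rather than -->
<a href="https://example.com/support/forum">click here</a>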

  • Descriptive anchor text helps search engines understand the focus of linked pages and how content relates to each other.

  • It also enables users to better evaluate the utility of a link before clicking it.

  • Avoid using exact match anchor text like your target keywords across all links - this appears unnatural.

  • Ideally the anchor text will be unique and contextual for each link on a page.

This natural internal linking structure benefits SEO by signaling relevance for ranking purposes.

It also improves usability by clarifying for users where a link will take them on the site.

3. Optimizing user experience

Make site mobile-friendly

With increasing search traffic coming from mobile devices, having a mobile-friendly site is essential for both users and search visibility.

There are various best practices to implement:

  • Use a responsive design to dynamically adapt layout on any device.

  • Avoid software that blocks crawling of mobile pages.

  • Design with tap targets and spacing for touchscreens in mind.

  • Keep page loading fast - optimize images, minimize redirects, etc.
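
One building block of responsive design is the viewport meta tag in each page's <head>, which tells mobile browsers to scale the layout to the device:

<meta name="viewport" content="width=device-width, initial-scale=1">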

Making these mobile optimizations improves user experience and engagement on mobile devices.

It also enables proper crawling and indexing by Googlebot's smartphone user agent.

Mobile friendliness is a ranking factor - sites not optimized for mobile can suffer in search results.

Conduct testing across devices and use Google's mobile-friendliness tool to identify any issues.

Maintaining a functional and fast mobile site should be an ongoing technical SEO priority given the massive mobile search market share.

Improve site speed and page load times

Optimizing site speed makes pages load faster for users and facilitates more efficient crawling for search bots. Some ways to improve speed include:

  • Minify HTML, CSS, JS and compress images.

  • Leverage browser caching and use a CDN to store assets.

  • Remove render-blocking elements and unnecessary redirects.

  • Upgrade to faster web hosting and implement server caching.

  • Defer offscreen images/scripts.

  • Optimize databases and inefficient code causing slowdowns.

Faster page loads enhance user experience leading to better engagement metrics like lower bounce rates.

Quick loading also allows bots to crawl more pages within the same crawl budget.

Improved speed has become a factor in search rankings as well.

Conduct audits using PageSpeed Insights and measure metrics like TTFB (time to first byte) to benchmark speed improvements over time.

Optimizing performance is essential for Technical SEO in modern search landscapes.

Fix broken pages and 404 errors

Broken pages or those returning 404 status codes should be addressed as they negatively impact users and waste crawling resources.

  • Identify broken pages through site audits and Search Console error reports.

  • Determine if the content should be reinstated, properly redirected, or removed entirely.

  • Redirect broken pages to relevant working content when appropriate.

  • Eliminate broken internal links pointing users to 404s.

  • Implement custom 404 error pages that offer suggestions and site navigation to reduce bounce rates.

  • Monitoring site crawls and server logs helps catch errors proactively.

  • Fixing broken pages and 404s provides users with a fully functional site.

It also enables search engines to discover and properly index all working content without wasting efforts on defunct pages.

Maintaining an error-free site enhances organic performance by improving crawl efficiency and user experience.
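
For the custom 404 page tip above, a minimal sketch assuming an nginx server (path hypothetical):

error_page 404 /404.html;

Apache offers the equivalent via the ErrorDocument directive.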

Use structured data for rich results

Structured data provides additional context about page content to search engines through schema markup.

This enables rich result formats like star ratings, images, or pricing in the SERPs.

Some common schema types are product, recipe, review, event, and breadcrumbs. The most widely used format is JSON-LD. For example:

<script type="application/ld+json">
{
"@context": "https://schema.org/",
"@type": "Product",
"name": "Widget",
"image": "widget-img.jpg"
}
</script>

Adding appropriate schema markup to pages enables search engines to generate rich snippets, carousels, or other engaging result formats.

This can improve click-through rates. Structured data also makes pages stand out by presenting key information directly in the SERPs.

Optimizing use of schema where applicable enhances how search engines present your content.

Format content for easy reading and skimming

Formatting content in a scannable way allows users to quickly extract information and improves reading experience.

Best practices include using brief paragraphs, highlighting key points with bullet lists, incorporating meaningful subheadings, and breaking up text with relevant images.

  • Ensure font sizes and contrast make text easy to read.

  • Avoid walls of dense text.

  • Use bold and italic formatting for emphasis selectively.

  • Optimizing content for on-screen consumption enhances engagement metrics like time on page.

  • For search engines, properly formatted content makes it easier to parse and understand page focus during indexing.

Formatting also provides visual cues that help algorithms evaluate aspects like expertise, authoritativeness, and trustworthiness.

Presenting content in a consumable way improves perception for both users and search bots.

Optimize images and video for faster loads

Large file sizes for images and videos can significantly slow down page load speeds. There are various optimizations to implement:

  • Compress JPEG, PNG, GIF files to reduce filesize without losing quality.

  • Use optimal image dimensions - resize based on layout, crop to focus on key subjects.

  • Set height/width attributes on image tags to prevent layout reflow as assets load.

  • Enable lazy loading to only load offscreen images when scrolled into view.

  • Deliver videos from a media host like YouTube instead of self-hosting large files.

  • Use MP4 format and compression for videos.
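
Putting a few of these together, an optimized image tag might look like this (filename and alt text are hypothetical):

<img src="green-widget-800.jpg" width="800" height="600" loading="lazy" alt="Green widget on a white background">

The explicit width/height reserve space to prevent layout shift, and loading="lazy" defers offscreen images natively.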

Optimized assets reduce page weight, enabling faster loading for users.

Quicker page speeds allow bots to crawl more of your site within crawl budgets.

With page speed being a ranking factor, optimizing images and video improves organic search visibility.

It also leads to better user experience and engagement across devices.

Improve site navigation and information architecture

An intuitive site navigation and information architecture (IA) allows both users and search bots to easily find and access relevant content.

Best practices include:

  • Organize content in a logical hierarchy based on topics and relationships.

  • Implement clear and consistent navigation elements across all pages.

  • Use descriptive menus, categories, and metadata to convey page focus.

  • Ensure internal links connect related content together.

  • Provide site search and sitemaps to facilitate content discovery.

  • Don't bury key pages deep in complex IA hierarchies.

Improved navigation makes it easier for search engines to crawl and categorize content.

It also enables users to locate information more efficiently. Poor site architecture hinders indexing and creates user experience issues.

Structuring information intuitively should be an ongoing focus for overall findability and SEO success.

Create compelling titles and meta descriptions

Page titles and meta descriptions represent content in search results, so optimizing them is crucial.

Effective titles attract user interest while plainly explaining the page focus.

  • Add keywords naturally and use branding/modifiers for uniqueness.

  • Meta descriptions should provide a snippet that motivates clicks - highlight value propositions or key differentiators.

  • Avoid excessive length and duplication.

  • Compelling page titles and descriptions improve click-through rates by helping users identify the most relevant search result.

  • For SEO, they characterize page content to search engines, aiding categorization. Phrasing titles as questions may also boost engagement.

Continually refine page titles and meta data to maximize search traffic. Keep them concise and descriptive to drive clicks and enhance how search engines interpret pages.

Migrate site to HTTPS

Migrating to the HTTPS encrypted protocol provides security and trust assurances for users.

  • From an SEO standpoint, Google has made HTTPS a ranking signal - sites on HTTPS benefit versus HTTP counterparts.

  • To migrate, first obtain an SSL certificate and install it on your servers. Update site code to use https:// URLs and adjust links accordingly.

  • Set up 301 redirects from old HTTP pages to new HTTPS versions to transfer authority. Update sitemaps and change references in analytics tools.

  • Test thoroughly before go-live to catch any issues with third-party scripts, images, etc not loading.

  • Moving to HTTPS shows users the site is secure and improves organic search visibility.
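
For the redirect step in the list above, a minimal sketch assuming an nginx server (adjust domains to your own):

server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}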

HTTPS is also necessary to take advantage of browser features like service workers and HTTP/2, which can speed up page delivery.

Minimize use of pop-ups and interstitials

Pop-ups, overlays and interstitial advertisements that block page content frustrate users and make it difficult for search engines to properly analyze pages.

  • Avoid intrusive pop-ups that overwhelm the user experience.

  • If they are necessary, delay when they display and include clear dismissal/close functionality.

  • Eliminate pop-up forms that hinder access to main content.

  • Interstitial splash pages should not disrupt the transition from search result to destination page.

  • Minimal use of pop-ups and interstitials improves crawlability and indexing by keeping page content easy to access.

  • It also provides a smoother user experience encouraging engagement rather than blockage and abandonment.

Any critical information or calls-to-action should be incorporated into page design without obtrusive pop-ups hampering usability.

Make forms easy to use on mobile

With increasing mobile traffic, having forms that are user-friendly on smartphones and tablets is essential.

  • Use responsive design so form elements resize and stack properly on smaller screens.

  • Expand tap targets for buttons and links to facilitate tapping.

  • Minimize character input requirements that are tedious on mobile keyboards.

  • Avoid dropdown selections where type-ahead input fields could suffice.

  • Implement browser autocomplete attributes to simplify filling known info.

  • Validate forms prior to submission to avoid errors or lost progress.

  • Optimizing forms for mobile makes it easy for visitors to successfully convert and improves analytics.

  • Enhanced user experience also positively impacts organic rankings and traffic as mobile usage grows.

Testing forms across devices and addressing usability pain points should be an ongoing priority.
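
As an example of the autocomplete tip above, hypothetical email and phone fields might look like:

<label for="email">Email</label>
<input id="email" type="email" name="email" autocomplete="email">

<label for="phone">Phone</label>
<input id="phone" type="tel" name="phone" autocomplete="tel">

The input types also trigger the appropriate mobile keyboards.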

Design for accessibility

Designing sites to be accessible for those with disabilities enhances experience for all users.

Site accessibility best practices include:

  • Add ARIA landmark roles to convey semantic page regions

  • Write descriptive alt text for images conveying context

  • Ensure proper contrast ratios for text readability

  • Make links, buttons, and media elements keyboard focusable

  • Use semantic HTML headings and lists for screen readers

  • Support screen magnifiers with fluid layouts

  • Provide captions and transcripts for audio/video content
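
A small sketch pulling a few of these together (labels, filenames, and alt text are hypothetical):

<nav aria-label="Main navigation">
  <a href="/blog/">Blog</a>
  <a href="/help-center/">Help Center</a>
</nav>
<main>
  <h1>Getting started</h1>
  <img src="setup-diagram.png" alt="Diagram of the three-step setup process">
</main>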

Optimizing for accessibility facilitates usage by disabled visitors relying on assistive devices and software.

A more robust experience for all users also improves organic performance - driving more quality traffic and engagement.

Making content and functionality available to the widest range of visitors should be an ongoing initiative as part of technical SEO.

4. Consolidating page authority

301 redirect old URLs

When pages are moved or restructured on a site, 301 redirects should be implemented to pass link equity from old URLs to the new destinations. For example, in an Apache .htaccess file:

Redirect 301 /old-page.html https://www.example.com/new-page

301 redirects are permanent, signaling to search engines that the page has permanently moved versus being temporarily unavailable.

This transfers any authority and indexation from the old URL over to the new one.

Without 301s, moving or changing URLs can result in dead links, lost rankings, and dilution of authority.

Setting proper 301s consolidates authority and preserves organic visibility when making changes to site content or architecture.

Using the specific 301 code tells search engines the redirect is intentional and permanent.

Canonicalize duplicate content

Duplicate content on a website can divide authority across multiple versions of the same content.

Implementing canonical tags indicates to search engines which URL represents the primary, authoritative version that other copies should defer to. For example:

<link rel="canonical" href="https://www.example.com/page">

This specifies example.com/page as the canonical URL.

The tag is placed on all duplicate versions to consolidate signals like link equity to the main version.

Without canonicalization, duplicate content results in diluted page authority and search rankings.

It can also trigger penalties for over-optimization.

Canonical tags clarify which URL search engines should prioritize in indexing and ranking.

This avoids divided authority and focuses optimization power on the target URL.

Avoid redirect chains

Redirect chains occur when a URL redirects to another URL which then redirects to another page, creating a sequence of multiple redirects before landing on a final page.

  • These long chains slow down page loading and create a poor user experience.

  • For SEO, redirect chains can result in diluted page authority and ranking problems.

  • To avoid, analyze site redirects to identify chains.

  • Minimize unnecessary hops by consolidating multi-step redirects into direct single redirects wherever possible.

  • For necessary sequences, limit chains to a maximum of 3-4 steps.

  • Eliminating excessive redirect chains improves site speed and prevents crawling issues for search engines.

  • Consolidating to direct or shorter chains also funnels link signals directly to target pages.

Avoiding overly long series of redirects enhances technical SEO foundations.
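
As a sketch of consolidating a chain, assuming an nginx server (paths hypothetical):

# Before: /old-page -> /newer-page -> /final-page (two hops)
location = /old-page { return 301 /newer-page; }
location = /newer-page { return 301 /final-page; }

# After: both URLs point straight at the final destination
location = /old-page { return 301 /final-page; }
location = /newer-page { return 301 /final-page; }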

Remove toxic or unnatural links

Low-quality links from spammy or irrelevant sites can harm a domain's authority and lead to algorithmic penalties.

These "toxic" links should be identified and disavowed or removed.

Check site backlinks using tools like Ahrefs to uncover suspicious patterns or obvious low-quality links.

Reach out to site owners requesting removal of harmful links pointing to your domain.

Create a disavow file of bad links that cannot be removed to instruct Google not to count them.

Continually monitor backlinks and disavow new toxic links as needed.

Eliminating shady links helps maintain a natural link profile that adds trust and authority rather than negatively impacting rankings.

While quality links should be cultivated, identifying and cleaning up unnatural links preserves search visibility and prevents penalties.

5. Monitoring and maintaining site health

Regularly audit technical SEO issues

Websites constantly evolve and change over time, resulting in new technical SEO issues arising.

Conducting regular technical audits identifies problems proactively so they can be fixed before significantly impacting performance.

Some key areas to audit include site crawlability, indexation, page speed, structured data, links, duplicate content and more.

  • Use tools like Google Search Console, ScreamingFrog, and Google PageSpeed Insights to surface technical problems.

  • Have an ongoing schedule for comprehensive audits quarterly or biannually.

  • Monitoring critical issues more frequently like 404 errors and site speed is also wise.

  • Staying on top of technical SEO prevents critical errors and maintains strong site health.

It also informs opportunities to further optimize performance.

Regular audits help proactively identify and resolve technical problems before users or search engines are negatively impacted.

Check search appearance for enhancements

Monitoring how your important pages appear in search results can reveal opportunities for enhancements to improve click-through rates.

  • Check title tags and meta descriptions for pages ranking for target keywords. Ensure they are compelling and click-worthy.

  • Review if sitelinks, images, or other rich results are being shown - add schema markup to gain rich snippets where relevant.

  • Check for hidden indexed pages that should be removed from results.

  • Analyze what competitors are doing that may inspire tests on your own site.

  • Reviewing search appearance enables optimizing pages to increase click-through rates from organic listings.

  • It also helps maximize the in-SERP presence through additional elements like sitelinks.

Gaining insight into how your site displays in results should inform ongoing SEO efforts to enhance visibility, clicks, and traffic from organic search.

Analyze Google Search Console reports

Google Search Console provides invaluable data to inform technical optimization.

Regularly review core reports including indexing, crawl stats, sitemaps, and search analytics.

  • Index coverage reveals pages Google can't access, so you can improve crawlability.

  • Crawl stats show errors and blocked pages needing fixes.

  • Sitemaps verify your pages are being processed.

  • Search analytics indicates query and click volume to optimize pages for visibility and engagement.

  • Additional reports like security issues, markup, and enhanced experience identify areas for improvement.

  • Thoroughly analyzing Search Console data identifies both critical errors needing resolution and opportunities to enhance organic performance through better indexing, speeds, etc.

These insights should directly inform priorities and tests to elevate technical SEO.

Maximizing use of Search Console reporting improves visibility in SERPs and website functionality for long-term success.

Monitor crawl stats and errors

Crawl stats provide vital signals about issues blocking search engines from properly accessing and indexing your site.

  • In Google Search Console, regularly review crawl stats reports for crawl budget spent, page errors encountered, server response codes, etc.

  • Analyze for spikes in errors or unexpected dips in pages crawled to diagnose problems.

  • Study error log details to identify specific problematic URLs.

  • Monitor daily for critical issues like server errors or security warnings.

  • For other data like 404s, do weekly or monthly checks for minor problems accumulating over time.

  • Act swiftly to address serious technical errors revealed through crawl stats.

Ongoing monitoring ensures technical problems get flagged early before they scale into major visibility or traffic drops.

Watching crawl data helps preserve strong technical SEO foundations to support organic performance.

Review index coverage reports

Index coverage reports in Google Search Console show which of your site's pages Google has included in its search index and which it has excluded.

Monitoring this reveals pages search engines are struggling to access so improvements can be made.

  • Review index coverage regularly to check for declines warranting investigation.

  • Drill into URL details to identify pages Google can't crawl, like those blocked by robots.txt.

  • Diagnose why important pages are excluded from indexing, such as site architecture issues prohibiting crawling.

  • Improve technical factors like site speed and internal linking to facilitate access.

  • Request indexing of critical unindexed pages via the URL Inspection tool (formerly Fetch as Google).

  • Aim for nearly 100% of quality pages indexed.

  • Lacking index coverage points to technical obstacles limiting SEO visibility.

Continually reviewing index reports ensures critical content gets included in search results, maximizing reach for target keywords and topics.

Check performance of key rankings

Monitoring your website's rankings for important target keywords is crucial for gauging SEO success over time.

  • Regularly check rankings in Google for your top 10-20 keywords using incognito mode.

  • Compare against historical performance and goals.

  • Look for patterns like consistent improvements or volatility indicating technical issues.

  • For declines, diagnose potential causes like site migrations, quality updates, or page speed problems.

  • For core brand terms, aim for top 3 rank.

  • Review ranking trends of competitors for comparison context.

  • Use rank tracking software to automatically surface declines needing intervention.

  • Consistently tracking key terms provides visibility into technical optimization efficacy.

By closely monitoring changes, you can quickly catch and troubleshoot technical factors negatively impacting performance for important keywords.

This helps sustain and build upon hard-earned SEO visibility.

Connect Google Analytics to track impact of changes

Connecting your Google Analytics account provides valuable tracking of how website changes and technical optimizations impact performance.

  • You can view analytics for queries, clicks, and impressions in Search Console to see how they are affected by fixes for issues like site speed or indexing.

  • See if new structured data leads to increased CTR.

  • Compare before/after analytics to measure impact of URL migrations or indexing expansions.

  • Linking Search Console to your Analytics property correlates the two data sets automatically.

  • This lets you track KPIs tied to technical factors across one unified platform.

  • Connecting the data streams gives crucial visibility into how technical SEO improvements translate into bottom-line results.

This enables optimizing elements like crawl efficiency for maximum search visibility and traffic gains.

Keep XML sitemaps updated

XML sitemaps provide search engines with a list of pages on your site to focus crawling on.

However, they quickly become outdated as site content evolves.

  • To maintain effectiveness, sitemaps must be updated regularly to reflect new, modified and removed pages.

  • Regenerate sitemaps at least monthly, or ideally weekly for large or frequently changing sites.

  • Re-ping search engines when submitting refreshed sitemap files.

  • Follow proper sitemap guidelines for formatting, size and URL restrictions.

  • Failing to update sitemaps results in broken links, inefficient crawling of new content, and incomplete indexing.

  • Updating ensures that search engines have the latest roadmap to access and properly crawl your site's current architecture.

This helps maximize inclusion of new content in search indexes which can improve rankings for emerging topics and keywords.

Re-submit sitemaps as needed

Simply updating your XML sitemaps is not enough - search engines must be notified of changes to recrawl the URLs.

  • When publishing refreshed sitemaps, resubmit them to Google through Google Search Console.

  • Click Index > Sitemaps and submit the new sitemap files. You can also Ping Google after sitemap updates.

  • This prompts search bots to re-crawl the submitted URLs.

  • Verify in Search Console that new sitemaps are being processed without errors.

  • For significant site changes like migrations impacting many URLs, also consider requesting indexing of key URLs via the URL Inspection tool (or using the Change of Address tool for domain moves).

  • Re-submitting sitemaps ensures search engines are accessing the current versions.

  • This prevents dead links and facilitates timely crawling of updated content.

Staying on top of proper sitemap re-submission maintains strong technical SEO foundations.

Stay updated on algorithm changes

Google and other search engines periodically release algorithm updates that can impact rankings and visibility.

  • Staying informed on these changes helps diagnose issues and optimize accordingly.

  • Monitor Google webmaster blogs and forums for official announcements.

  • Review SEO community publications reporting on suspected updates.

  • Identify core website metrics like rankings, traffic and click-through rates that are most likely to be affected.

  • Analyze Search Console data for patterns around update timing.

  • Adapt technical factors as needed - for example, focusing more on site speed if page experience emerges as an algorithm factor.

  • Being proactive about algorithm changes allows reversing any negative impacts and leaning into positive optimizations.

Keeping your technical SEO strategy aligned with the latest algorithms ensures visibility keeps pace with how search engines evaluate and rank pages.

Wow, that was intense.

I hope you found this helpful.

Now go outside and get some sunshine.