Emmett Miller, Co-Founder

How to Create SEO-Friendly URLs: A Complete Guide (2026)

May 13, 2026
[Figure: SEO-friendly URL structure with a keyword slug and subdirectory hierarchy]

TL;DR: An SEO-friendly URL includes the target keyword, separates words with hyphens, keeps the slug under 75 characters, and sits in a logical subdirectory. Use HTTPS. Avoid dynamic parameters where possible. Don't change a URL that already has traffic or backlinks unless you're prepared for a 6-12 month recovery.


Last updated: May 2026

URLs look like a small detail. They aren't. A poorly structured URL system affects how Google crawls your site, how link equity flows between pages, and how likely your pages are to get indexed at all. Most guides cover the surface: keyword in the URL, hyphens not underscores. This one goes deeper into site architecture, crawl depth, and when changing a URL is a mistake you'll spend months recovering from.

What Makes a URL SEO-Friendly?

An SEO-friendly URL is clean, descriptive, and contains the page's target keyword. When a user or search engine bot reads it, they immediately understand what the page covers. When it's part of a well-planned site structure, it also signals how important that page is relative to everything else on the site.

Here is what the difference looks like in practice:

  • Good: example.com/blog/cold-email-templates
  • Bad: example.com/p?id=8923&cat=7&ref=newsletter
  • Bad: example.com/2026/05/13/the-ultimate-guide-to-writing-cold-email-templates-for-b2b-startups

A clean URL passes one test: a stranger who reads it can guess what the page covers. Beyond that, there are eight specific decisions that determine whether your URLs help or hurt your rankings.

Include Your Target Keyword in the URL

Google uses the words in a URL as a light ranking signal. Before a bot processes your page content, it reads the URL and uses it to form an initial understanding of what the page covers. That means a URL slug like cold-email-templates gives Google a head start on matching your page to the right queries.

The effect is modest compared to content quality or backlinks, but it's real. And it has a secondary impact that often matters more: keywords in URLs improve click-through rates in search results. When a searcher sees a URL that contains the words they just typed, it confirms they're in the right place. That increases clicks even before Google's ranking algorithm gets involved.

The practical rule:

Put the target keyword in the slug, strip stop words and modifiers, and stop there.

Good:

  • /blog/cold-email-templates (keyword is "cold email templates")
  • /blog/b2b-lead-generation (keyword is "b2b lead generation")
  • /blog/seo-friendly-urls (keyword is "seo friendly urls")

Not good:

  • /blog/the-ultimate-10-best-cold-email-templates-for-b2b-saas-startups-in-2026 (bloated)
  • /blog/post-1 (no keywords)
  • /blog/cold-email-cold-email-templates-best-cold-email (stuffed)

Don't copy the full article title into the slug. Remove articles ("the", "a"), prepositions ("for", "with"), and year modifiers unless they are actually part of the query people search. If someone searches "cold email templates," your slug is /cold-email-templates, not /cold-email-templates-2026.
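
To make the slug rule concrete, here's a minimal Python sketch that turns a target keyword into a slug: lowercase, hyphen-separated, with stop words and year modifiers stripped. The stop word list is a small illustrative sample, not an exhaustive one.

```python
import re

# Illustrative stop word list; extend it to match your own editorial rules.
STOP_WORDS = {"the", "a", "an", "for", "with", "to", "of", "in", "on", "and"}
YEAR = re.compile(r"^(19|20)\d{2}$")

def make_slug(keyword: str) -> str:
    """Build a lowercase, hyphen-separated slug with stop words and years removed."""
    words = re.findall(r"[a-z0-9]+", keyword.lower())
    kept = [w for w in words if w not in STOP_WORDS and not YEAR.match(w)]
    return "-".join(kept)

print(make_slug("Cold Email Templates"))       # cold-email-templates
print(make_slug("SEO-Friendly URLs in 2026"))  # seo-friendly-urls
# Feed it the target keyword, not the full article title.
```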

When to match a competitor's URL slug:

If a competitor ranks in the top 10 positions for your target keyword and their slug is different from what you'd derive from the keyword, consider matching theirs. Search engines have already associated that exact slug with the query. Going head-on with a better article at a matching URL pattern gives your page a faster path to relevance.

Strip any CMS-added numeric IDs first: competitor.com/blog/seo-friendly-urls-79847 becomes your-site.com/blog/seo-friendly-urls.

Use Hyphens to Separate Words, Not Underscores

This rule has no gray area. Google treats hyphens and underscores differently when it reads URLs, and the difference has a direct impact on keyword recognition.

Hyphen: a word separator. cold-email is read as two words: "cold" and "email."

Underscore: a word connector. cold_email is read as one word: "coldemail."

When you use underscores in a URL slug, the individual keywords don't register as separate words. A page at example.com/cold_email_templates contains the keyword string "coldemailtemplates" from Google's perspective, not the three separate keywords "cold," "email," and "templates."

Google has explicitly recommended hyphens over underscores in its developer documentation. John Mueller and Matt Cutts both addressed this in YouTube videos and developer Q&As. This isn't a contested SEO opinion; it's a stated Google preference.

What else to avoid:

  • Spaces. A space in a URL gets encoded as %20, producing slugs like cold%20email%20templates. Technically valid, but it looks suspicious to users and is harder for bots to parse cleanly.
  • Plus signs. Sometimes used in query strings to replace spaces, but not appropriate in clean URL slugs.
  • Special characters. @, #, &, ?, ! all have special meanings in URL syntax and can break routing or create unintended parameter strings.
  • Uppercase letters. example.com/Cold-Email-Templates and example.com/cold-email-templates are technically different URLs. Mixing cases can create redirect loops or duplicate pages. Lowercase throughout is the safe default.

The rule is simple: lowercase, hyphens only, no special characters.
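
A lint-style check catches most of these issues before a page is published. This is a minimal sketch in Python; the rules simply mirror the list above and aren't a complete validator.

```python
import re

# Lowercase words separated by single hyphens, nothing else.
VALID_SLUG = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def slug_problems(slug: str) -> list[str]:
    """Return human-readable problems found in a URL slug."""
    problems = []
    if slug != slug.lower():
        problems.append("contains uppercase letters")
    if "_" in slug:
        problems.append("uses underscores instead of hyphens")
    if " " in slug or "%20" in slug:
        problems.append("contains spaces (would be encoded as %20)")
    if re.search(r"[@#&?!+]", slug):
        problems.append("contains special characters")
    if not problems and not VALID_SLUG.match(slug):
        problems.append("does not match the lowercase-hyphenated pattern")
    return problems

print(slug_problems("cold-email-templates"))   # []
print(slug_problems("Cold_Email Templates!"))  # flags all four problems
```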


Keep Your URL Slug Under 75 Characters

URL length advice online is often confused because it conflates two different things: full URL length and slug length. They're not the same problem.

Full URL length (including the protocol, domain, subdirectories, and slug) is not a meaningful SEO concern. John Mueller has addressed this directly: Google handles URLs with hundreds or even thousands of characters without issue. You're unlikely to accidentally build a URL that breaks anything.

Slug length (the segment after the final /) is what actually matters for SEO. Most major SEO tools flag slugs over 75 characters. RankMath, Yoast, and SEMrush all use this threshold. The reason isn't arbitrary: shorter slugs correlate with higher rankings in third-party studies, and there are at least two concrete mechanisms behind that correlation.

First, when Google has to choose a canonical version among near-identical pages, it tends to prefer the shorter URL. If you have a page with a clean slug and a duplicate with a longer one, the shorter version is more likely to be selected as the primary.

Second, shorter URLs are easier to copy, share, and use as anchor text in backlinks. A 90-character slug that spills across two lines in a tweet or email gets truncated or mangled. A short slug survives intact.

The practical rule:

Your slug should be the target keyword and nothing else.

  • Strip the year if you plan to update the article over time: /cold-email-templates, not /cold-email-templates-2026.
  • Strip stop words that aren't part of the actual keyword: /email-list-building, not /how-to-build-an-email-list.
  • Don't add category prefixes that repeat information the subdirectory already provides: /blog/cold-email-templates, not /blog/blog-posts/cold-email-templates.

One exception: if "how-to" is part of the query people are actually searching, like "how to create seo friendly urls", include it in the slug. The test is whether stripping a word makes the slug match the keyword less precisely.
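
Both checks, length and keyword match, are easy to automate. This sketch flags slugs over the 75-character threshold and slugs that drop words from the target keyword; it's a rough illustration, not a substitute for editorial judgment.

```python
def check_slug(slug: str, target_keyword: str, max_len: int = 75) -> list[str]:
    """Flag slugs that are too long or that drop words from the target keyword."""
    warnings = []
    if len(slug) > max_len:
        warnings.append(f"slug is {len(slug)} characters (limit {max_len})")
    keyword_words = target_keyword.lower().split()
    slug_words = slug.split("-")
    missing = [w for w in keyword_words if w not in slug_words]
    if missing:
        warnings.append(f"slug drops keyword words: {missing}")
    return warnings

print(check_slug("cold-email-templates", "cold email templates"))  # []
print(check_slug("email-templates", "cold email templates"))       # drops 'cold'
print(check_slug("how-to-create-seo-friendly-urls",
                 "how to create seo friendly urls"))                # [] - "how-to" kept
```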

Use Subdirectories to Build Logical Site Structure

Subdirectories are the folder-like segments in a URL: /blog/, /tools/, /solutions/. They tell Google how your site is organized and determine how link equity flows through the hierarchy. Used well, they create hub pages that rank for broad keywords. Used poorly, they add crawl depth without adding value.

The structural approach: grouping by content type

example.com/blog/...
example.com/tools/...
example.com/pricing
example.com/about

This is the default for most sites. It works fine for pages that don't need organic traffic (About, Contact, Pricing). The problem: structural grouping often turns the top-level URL into a generic index page that doesn't rank for anything. A page at /tools that just lists your tools has no topical authority on its own.

The thematic approach: grouping by topic

example.com/ai-sdr/...
example.com/cold-email/...
example.com/seo/...

When you make /ai-sdr a top-level page that covers the AI SDR topic, it becomes a hub page. Every child page under it (/ai-sdr/pricing, /ai-sdr/vs-human-sdr, /ai-sdr/best-tools) inherits link equity from the hub. The hub page ranks for broad searches. The child pages rank for specific ones.

Search Engine Journal's /ranking-factors page is a documented example. That top-level hub has 14,000+ internal links pointing to it across the site and ranks for over 1,000 keywords. A nested child page about a single ranking factor has 48 internal links and ranks for 14 keywords. The URL structure is part of the reason, but the deeper cause is that the architecture made it easy to concentrate authority on the hub.

When categories in URLs help vs. hurt:

Adding a category segment (/blog/seo/url-structure) makes sense when your site covers many unrelated topics and you need to silo authority by theme. For sites under 100 pages, the extra depth usually adds crawl cost without meaningful benefit. When in doubt: keep it shallow.

HTTPS:

HTTPS has been a Google ranking factor since 2014. If you're still on HTTP, address that before worrying about any other URL optimization. The padlock in your browser's address bar is the quick check.
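
A programmatic version of the padlock check: request the HTTP version of the homepage and confirm it redirects to HTTPS. This is a minimal sketch using the requests library; the domain is a placeholder.

```python
import requests

def check_https(domain: str) -> None:
    """Confirm the HTTP version of a site redirects to HTTPS."""
    resp = requests.get(f"http://{domain}/", allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code in (301, 302, 307, 308) and location.startswith("https://"):
        print(f"{domain}: redirects to {location} ({resp.status_code}; 301 is preferred)")
    else:
        print(f"{domain}: no HTTPS redirect found (status {resp.status_code})")

check_https("example.com")  # placeholder domain
```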

Think About Crawl Depth and Homepage Distance

Every site has a crawl budget. Google allocates bot time based on a site's size, authority, and freshness signals. Larger, more authoritative sites get more. Smaller sites get less. That finite budget means every click of depth from your homepage has a cost.

John Mueller has confirmed directly that pages further from the homepage are treated as less critical by Google's crawlers. The reasoning is straightforward: if a bot starts at your homepage and follows links, pages that are four or five clicks deep get visited less frequently. Google may not crawl them on every cycle. If a page isn't getting crawled consistently, it isn't getting indexed and re-evaluated consistently.

What this means for URL structure:

A page at /seo-tools (one click from the homepage, via the navigation) gets crawled far more often than /resources/blog/category/seo/tool-guides/url-optimization-2026 (five clicks deep). If the content quality is similar, the shallower page wins on crawl frequency alone.

This is partly why hub pages outperform deeply nested child pages. It's not purely the URL structure itself; it's that shallow, hub-level pages accumulate more internal links, get crawled more frequently, and attract more backlinks over time. The URL structure makes those outcomes more likely.

Practical rules:

  1. Keep pages you want to rank at the top level or one level deep.
  2. Link to important pages from the homepage and from every major section of the site, not just from the parent category.
  3. Use your XML sitemap to surface deep pages to Google even if they're several clicks from the homepage.
  4. Monitor indexation in Google Search Console. If important pages aren't being indexed, crawl depth and sparse internal linking are the first two places to investigate.

If you can't change the URL:

Internal links compensate. A page that is six clicks deep but receives 200 internal links from high-traffic pages will still get crawled regularly. Crawl budget allocation responds to both URL structure and link signals. Fixing internal linking is often faster and less risky than restructuring your entire URL hierarchy.
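
URL path depth is only a rough proxy for click depth, but it's quick to audit at scale. This sketch pulls an XML sitemap and flags URLs with more than two path segments; the sitemap URL is a placeholder, and real sites often split sitemaps into index files, which this doesn't handle.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def deep_urls(sitemap_url: str, max_segments: int = 2) -> list[str]:
    """Return sitemap URLs whose path has more than max_segments segments."""
    with urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    flagged = []
    for loc in tree.iter(f"{SITEMAP_NS}loc"):
        path = urlparse(loc.text.strip()).path
        segments = [s for s in path.split("/") if s]
        if len(segments) > max_segments:
            flagged.append(loc.text.strip())
    return flagged

# Placeholder sitemap URL.
for url in deep_urls("https://example.com/sitemap.xml"):
    print(url)
```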

Handle URL Parameters and Dynamic URLs Carefully

URL parameters (the strings that appear after ? in a URL, like ?sort=price&filter=brand&session=abc123) are a normal part of how web applications work. They're also one of the fastest ways to create crawling and indexing problems at scale.

What goes wrong:

Google treats each unique URL as a potentially distinct page. If your ecommerce site has 500 products and supports sorting by price, popularity, and newest, plus filtering by 10 categories, you can generate tens of thousands of distinct URLs pointing to essentially the same content. Google crawls all of them, splits link equity across all of them, and may flag the site for thin or duplicate content.

Session IDs are the most destructive variant. A URL like example.com/product?session=user_a1b2c3xyz creates a unique URL for every user session. Search engine bots generate their own sessions, meaning the URL Google indexes will never match what a real user sees. The result: thousands of indexed URLs, none of them stable, all of them wasting crawl budget.

How to fix parameter issues:

  1. Google Search Console. Google retired the standalone URL Parameters tool in 2022 and now decides how to handle most parameters on its own, so use Search Console's crawl stats and page indexing reports to spot parameter URLs being crawled and indexed, then fix them with the methods below. Parameters like ?sort= and ?ref= usually don't change the core content.
  2. Canonical tags. If a parameter variant must exist for app functionality, add <link rel="canonical" href="..."> on the parameter URL pointing to the clean version. This tells Google which URL to index without breaking the app.
  3. Robots.txt exclusions. For parameters that generate zero unique content (tracking codes, session IDs, A/B test variants), block crawling entirely.
  4. Restructure faceted navigation. /sneakers/mens/ is better than /products?category=sneakers&gender=mens. Clean subdirectory URLs are crawlable, linkable, and can rank independently.

The goal: every URL Google indexes should be a page worth ranking. Parameters that only add noise to your index should either redirect to the clean URL or be excluded entirely.
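
The canonical-tag fix in particular is easy to automate: strip the parameters that don't change the content and point the canonical at the clean URL. This is a minimal sketch; the list of ignorable parameters is an assumption you would tune to your own application.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters assumed not to change the core content; adjust for your app.
IGNORABLE_PARAMS = {"sort", "ref", "session", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Strip noise parameters and return the clean URL for rel=canonical."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def canonical_tag(url: str) -> str:
    return f'<link rel="canonical" href="{canonical_url(url)}">'

print(canonical_tag("https://example.com/sneakers?sort=price&session=abc123&color=red"))
# <link rel="canonical" href="https://example.com/sneakers?color=red">
```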

Think Twice Before Changing Existing URLs

If you're reading a URL structure guide while looking at your existing site, the natural reaction is to start cleaning things up. Resist it, at least for pages that are already getting traffic.

Changing a URL, even by a single character, creates a new page in Google's index. The old URL's history, meaning the backlinks pointing to it, the click-through rate signals it has accumulated, and the internal linking weight it carries, is all tied to the old slug. A redirect passes some of that value to the new URL, but not all of it, and not immediately.

What Google has said:

John Mueller has addressed URL changes repeatedly in developer Q&As and YouTube videos. His consistent message: even with a properly configured 301 redirect in place, recovering prior rankings after a URL change can take six months to a year or more. The redirect preserves link equity in principle, but in practice it's not a clean transfer, and the recovery timeline is unpredictable.

When it's safe to change a URL:

  • The page has been live for less than six months.
  • It has fewer than five backlinks from external sites.
  • It ranks below position 50 for every keyword.
  • The current URL contains a misspelling or is actively confusing to users.

When it is not worth the risk:

  • The page has been live for a year or more.
  • It ranks in the top 20 for any keyword.
  • It has earned backlinks that send referral traffic.
  • You have revenue attributed to organic traffic from that page.

If you must change the URL anyway:

  1. Set up a 301 redirect from the old URL to the new one before making any other change.
  2. Update all internal links that point to the old URL across the entire site.
  3. Submit the new URL to Google Search Console using the URL inspection tool.
  4. Monitor rankings for that page weekly for at least three months and expect a temporary dip.

The best URL decisions happen before a page is published. For existing pages with traffic, the correct decision is almost always to leave the URL alone and improve the content instead.
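
If you do go ahead with a migration, verify the redirects programmatically rather than spot-checking. This is a minimal sketch using the requests library; the old-to-new URL pairs are hypothetical.

```python
import requests

# Hypothetical old -> new URL pairs from a migration plan.
REDIRECTS = {
    "https://example.com/blog/the-ultimate-guide-to-cold-email-templates":
        "https://example.com/blog/cold-email-templates",
}

def verify_redirects(redirects: dict[str, str]) -> None:
    """Check that each old URL returns a 301 pointing at the intended new URL."""
    for old, new in redirects.items():
        resp = requests.get(old, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code == 301 and location == new:
            print(f"OK   {old}")
        else:
            print(f"FAIL {old} -> {resp.status_code} {location!r} (expected 301 to {new})")

verify_redirects(REDIRECTS)
```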

Automate SEO Content Operations With Miniloop

These guidelines cover the technical side of URL structure. But getting a site to rank involves more than formatting URLs correctly.

The actual execution work: keyword research across hundreds of queries, writing content briefs from SERP analysis, drafting posts that cover competitor topics with more depth, publishing to the CMS, managing internal links at scale, and tracking rank changes week over week. That's not a one-time task. It's recurring, repeatable work that compounds over time.

Miniloop handles that execution. We build and run SEO workflows for startup teams:

  • Keyword research and scoring: pulls target keywords from SEMrush, scores by difficulty, volume, and brand fit, and prioritizes the opportunities worth writing about first
  • SERP-based content briefs: analyzes competitor pages and top-ranking results to produce briefs that tell writers exactly what topics, headings, and entities to cover
  • Blog post drafting: writes full posts from briefs, following your tone guidelines and internal link strategy
  • Publishing to your CMS: pushes finished drafts to WordPress, Sanity, Webflow, or Contentful, formatted and ready to publish
  • Rank monitoring with Slack alerts: watches your target keywords weekly and notifies your team when positions change

Whether you have a content person running the strategy, are building the SEO system yourself as a founder, or are hiring for the function, Miniloop handles the repeatable execution work between knowing what to write and having published content ranking.

Try Miniloop or browse templates.

SEO-Friendly URL Examples

The difference between a problematic URL and a clean one is usually obvious when you see them together. Here are concrete before-and-after examples for the site types most likely to have URL issues.

Blog:

  • Bad: example.com/p?id=8923&post=true
  • Bad: example.com/2026/05/13/the-ultimate-guide-to-cold-email-templates-for-b2b-startups
  • Good: example.com/blog/cold-email-templates

SaaS product pages:

  • Bad: example.com/product?view=pricing&tier=growth&ref=nav
  • Good: example.com/pricing
  • Bad: example.com/features/AI-automation-platform-for-gtm-teams
  • Good: example.com/ai-automation

Ecommerce:

  • Bad: example.com/catalog/category/shoes?gender=mens&type=casual&sort=price_asc
  • Good: example.com/sneakers/mens
  • Bad: example.com/product/view.php?id=2847&cat=14&session=abc123
  • Good: example.com/sneakers/mens/nike-air-force-1

Local business:

  • Bad: example.com/services?location=boise&type=hvac&subtype=ac-repair
  • Good: example.com/idaho/ac-repair

The pattern across all of them:

  1. Clean over complex: remove parameters, dates, and stop words
  2. Keyword first: the core topic appears as early in the slug as possible
  3. Shortest sensible path: don't add subdirectories that don't help users or bots navigate
  4. Human-readable: a stranger who reads the URL can identify what the page covers

The clean examples above are not just easier to look at. They rank better because they accumulate backlinks with the URL as anchor text, display clearly in search results, and are easier for internal linking tools and content teams to reference and manage consistently.

  • Programmatic SEO - Scale SEO traffic with programmatic landing pages
  • Platform - How Miniloop's GTM agent platform works

Frequently Asked Questions

Do keywords in URLs actually help SEO rankings?

Yes, but modestly. Google uses URL words as a light ranking signal to understand what a page covers before processing the full content. The effect is smaller than content quality, backlinks, or page experience signals. Where keywords in URLs matter more noticeably is click-through rate: when a searcher sees their query words reflected in the URL, it confirms relevance before they click. That CTR improvement can outweigh the direct ranking signal in practice.

Should I use hyphens or underscores in URLs?

Always hyphens. Google treats hyphens as word separators and underscores as word connectors. A URL like `cold-email-templates` is read as three separate keywords: cold, email, templates. A URL like `cold_email_templates` is read as one word: coldemailtemplates. The individual keywords don't register separately with underscores. Google has stated a preference for hyphens in its developer documentation, and this hasn't changed.

How long should a URL slug be for SEO?

Keep the slug under 75 characters. Most SEO tools (RankMath, Yoast, SEMrush) flag slugs above that threshold. Shorter slugs correlate with higher rankings in research studies, partly because Google tends to prefer the shorter URL when deduplicating near-identical pages. The slug should contain the target keyword and little else. Strip stop words, dates, and category prefixes that the subdirectory already communicates.

Should I change old URLs to make them cleaner?

Not if the page has traffic or backlinks. Changing a URL creates a new page in Google's index, and even with a 301 redirect in place, John Mueller has said ranking recovery can take 6-12 months or more. Only change URLs on pages that have been live under 6 months, rank below position 50 for all keywords, and have fewer than 5 external backlinks. For established pages, the disruption almost always outweighs the benefit of a cleaner slug.

What are URL parameters and why do they hurt SEO?

URL parameters are the key-value pairs after a `?` in a URL, like `?sort=price&filter=red&session=abc`. They let web apps filter or customize content. The SEO problem: Google treats each unique URL as a potentially distinct page. An ecommerce site with 10 filter options and 500 products can generate thousands of near-duplicate indexed pages. Session IDs are worst because they create a unique URL per user, none of which match what a real visitor sees. Fix with canonical tags, robots.txt exclusions, and cleaner URL structures; Google retired Search Console's standalone URL Parameters tool in 2022, so those page-level controls are now the main levers.

Do subdirectories help with SEO?

They help indirectly by enabling better site architecture. Subdirectories let you create hub pages, like `/ai-sdr/` covering the AI SDR topic, that rank for broad keywords and funnel authority to child pages beneath them. Hub pages accumulate more internal links and backlinks than nested child pages, which is why top-level thematic pages consistently outrank deeply nested ones. Adding subdirectories just to seem organized, without creating genuine hub pages that rank for something, adds crawl depth without value.

Related Templates

Automate workflows related to this topic with ready-to-use templates.

Ahrefs · OpenAI · Google Docs

Generate SEO content briefs with AI and Ahrefs

Turn Ahrefs keyword research into detailed AI-generated content briefs. Automate SEO content planning and save hours per article.

Semrush · OpenAI · Slack

Track competitor SEO rankings with AI insights

Monitor competitor keyword rankings weekly with Semrush and get AI-powered analysis delivered to Slack. Never miss a ranking shift again.

Ahrefs · OpenAI · Notion

Generate SEO blog posts automatically with AI and Ahrefs

Turn keywords into full blog posts with AI. Automate research, writing, and SEO optimization. Publish directly to WordPress or Notion.
