
.htaccess File Generator

Create secure, optimized Apache .htaccess files in seconds. Configure redirects, security rules, caching, and performance settings with our intuitive visual builder. No coding expertise required.

Security Rules
Performance Boost
Custom Redirects
Instant Download
🛡️

Security

Block threats and protect files

Performance

Caching and compression rules

🔗

Redirects

301, 302, and URL rewriting

🎯

Templates

WordPress, Laravel, and more

🚀

New Projects

Start with optimized Apache configuration from day one

🔐

Security Hardening

Add layers of protection against common attacks

⚙️

Site Migration

Handle domain changes and maintain SEO rankings

Production-Ready Code
No Account Needed
Always Free

Compatible with Apache 2.2+ • Works with shared hosting, VPS, and dedicated servers

Generation Options

Master Apache .htaccess Configuration: Security, Performance, and SEO

Generate production-ready .htaccess files in seconds. Implement server-level security, optimize caching strategies, configure redirects correctly, and boost site performance without touching code. Essential for WordPress sites, Laravel applications, static hosting, and any Apache web server configuration.

Why .htaccess Files Are Your Secret Weapon for Website Control

The .htaccess file is Apache's directory-level configuration powerhouse that lets you control server behavior without touching the main server config. Think of it as your website's security guard, traffic director, and performance optimizer rolled into one text file. Placed in your web root directory, it executes before your application code runs—meaning you can block threats, redirect users, enable caching, and rewrite URLs at the server level. This happens in milliseconds before PHP even loads, making it incredibly efficient.

What Makes .htaccess Configuration Critical:

🛡️ Server-Level Security
Block malicious bots, prevent directory browsing, protect sensitive files, and filter SQL injection attempts before they reach your application. Security rules in .htaccess catch attacks at the earliest possible point—saving server resources and preventing exploits.
⚡ Performance Without Code Changes
Enable Gzip compression, configure browser caching, and leverage ETags—all without modifying your application. A properly configured .htaccess can reduce page load times by 40-60% through compression alone and slash bandwidth costs by serving cached assets.
🔗 SEO-Friendly URL Management
Force HTTPS, eliminate www/non-www duplicate content, implement 301 redirects correctly, and create clean URLs. Search engines penalize duplicate content and mixed HTTP/HTTPS—.htaccess rules fix this at the infrastructure level before Google even crawls your pages.
🚀 Framework-Friendly Routing
WordPress, Laravel, CodeIgniter, and modern frameworks rely on .htaccess for clean URL routing. The file tells Apache "if the requested file doesn't exist, send the request to index.php" – enabling /blog/my-post instead of /index.php?page=blog&post=my-post.

💡 Real-World Impact: The Forgotten .htaccess Mistake

A SaaS company launched their new marketing site on shared hosting. Three months later, they discovered their /api/ directory was publicly browsable—exposing internal endpoint documentation, test credentials, and database structure to anyone who typed in the URL. A competitor found it, copied their API design, and launched a similar product six months earlier than expected.

The fix: A single line in .htaccess, Options -Indexes, would have prevented this. The cost of not having it? An estimated $2M in lost first-mover advantage. This is why even basic .htaccess security isn't optional—it's fundamental infrastructure hygiene that takes 30 seconds to implement.

Essential Security Rules Every Website Needs

Security through .htaccess isn't about building an impenetrable fortress—it's about making your site harder to attack than the next one. Attackers scan millions of sites looking for low-hanging fruit. These rules eliminate the obvious vulnerabilities that get exploited in automated attacks. Each one takes seconds to implement but blocks entire categories of threats.

🚫

Disable Directory Browsing

Prevent visitors from seeing your folder structure and file lists

Why This Matters:

By default, Apache shows a file listing if no index.html or index.php exists in a directory. This seems harmless until attackers browse /uploads/ and find admin documents, /backups/ with database dumps, or /includes/ revealing your site structure. Directory browsing is reconnaissance handed to attackers on a silver platter.

The Rule:
Options -Indexes
Result: Anyone trying to browse a directory without an index file gets a 403 Forbidden error instead of a file listing. Placed in your web root .htaccess, the rule applies to every subdirectory automatically.
⚠️ Real Attack Vector:

Automated scanners check /wp-content/uploads/2024/, /assets/docs/, /temp/, and hundreds of other common paths. If they find directory listings enabled, they download everything and search for credentials, API keys, or personal data. Poorly secured, exposed web software has been implicated in major breaches—the Mossack Fonseca (Panama Papers) leak reportedly began this way.

🔐

Protect Sensitive Configuration Files

Block access to .env, .git, and other system files

The Hidden Danger of Exposed Config Files:

Your .env file contains database credentials, API keys, and secret tokens. Your .git directory holds your entire source code history. If these are accessible via browser, attackers download them in seconds. This isn't theoretical—automated tools like GitDumper specifically hunt for exposed .git directories on millions of sites daily.

Files Attackers Target:
  • .env – Database passwords, API keys, app secrets
  • .git/ – Complete source code and commit history
  • .htaccess – Your security configuration itself
  • composer.json – Dependency versions (for vulnerability scanning)
  • package.json – Node dependencies and versions
  • wp-config.php – WordPress database credentials
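As a concrete starting point, here is one hedged way to block the files above. The syntax assumes Apache 2.4 (on 2.2, replace Require all denied with Deny from all), and the filename list is an example you should adapt to your stack:
# Deny direct requests for dotfiles and common config files (Apache 2.4 syntax)
<FilesMatch "^(\.env|\.htaccess|wp-config\.php|composer\.(json|lock)|package\.json)$">
    Require all denied
</FilesMatch>
# FilesMatch cannot match directories, so answer anything under .git/ with a 404
RedirectMatch 404 "/\.git"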
📊 Industry Stats:

In 2023 security research, 11% of websites had accessible .git directories, and 3% exposed .env files. That's millions of sites with their credentials publicly visible. The average time from exposure to exploitation? Less than 48 hours. Attackers run automated scanners continuously, and they find these files before you realize they're exposed.

💉

Block SQL Injection Attempts in Query Strings

Filter malicious database queries before they reach your code

How .htaccess Stops SQL Injection:

SQL injection attacks work by inserting malicious database commands into URL parameters or form inputs. While your application should sanitize inputs, .htaccess adds a first line of defense by checking query strings for common attack patterns and blocking the request before it reaches PHP. This catches automated attacks that spray injection attempts across thousands of sites.

Attack Patterns Detected:
  • UNION SELECT statements in URLs
  • Encoded SQL keywords: %55NION, %53ELECT
  • JavaScript injection: <script> tags in parameters
  • Base64 encoded attacks: base64_encode, base64_decode
  • Database function calls: concat(), @@version
Important: This is defense-in-depth, not a replacement for proper input sanitization. Your application should still validate and escape database queries. .htaccess blocks the obvious attacks; your code handles the sophisticated ones.
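A minimal sketch of such a filter, assuming mod_rewrite is enabled; the patterns are illustrative, not a complete ruleset:
# Reject requests whose query string contains obvious injection patterns
RewriteEngine On
RewriteCond %{QUERY_STRING} (union.*select|concat\(|@@version) [NC,OR]
RewriteCond %{QUERY_STRING} (<script|base64_encode|base64_decode) [NC]
RewriteRule .* - [F,L]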
🔗

Prevent Image Hotlinking (Bandwidth Theft)

Stop other sites from embedding your images and stealing bandwidth

The Bandwidth Theft Problem:

Hotlinking happens when another website displays your images by linking directly to your server: <img src="https://yoursite.com/photo.jpg">. Every time someone visits their site, your server pays the bandwidth cost. Popular images can rack up thousands of dollars in hosting fees while driving zero traffic to your site. Image aggregators and content farms are notorious for this.

💸 Real Cost Example:

A photographer had a viral image hotlinked by 50+ websites. Over one month: 18TB bandwidth stolen, costing $270 in overage fees. Traffic to her own portfolio? Zero. After enabling hotlink protection, bandwidth dropped 85% immediately and hosting bills returned to normal.

How It Works:

.htaccess checks the HTTP Referer header—which domain is requesting the image. If the referer is another domain, Apache blocks the request or serves a replacement image; most rules allow empty referers so direct visits and privacy-focused browsers still work. The block happens before the image is served, saving bandwidth immediately.
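A common form of the rule, assuming mod_rewrite and using yoursite.com as a placeholder domain; the empty-referer check lets direct visits and privacy browsers through:
# Block image requests referred by other sites (empty referers are allowed)
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.com/ [NC]
RewriteRule \.(jpe?g|png|gif|webp)$ - [F,NC]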

Performance Rules That Actually Make Sites Faster

Page speed isn't just about user experience—it's a direct Google ranking factor. Sites loading in under 2 seconds rank significantly higher than those taking 5+ seconds. The performance rules in .htaccess operate at the HTTP protocol level, which means they execute before your application code runs, making them incredibly efficient. These aren't micro-optimizations; they're fundamental infrastructure improvements that deliver measurable results.

🗜️

Enable Gzip Compression

Compress text files by 60-80% before sending to browsers

How Compression Transforms Performance:

Gzip compression works like a ZIP file for your website assets. Before Apache sends HTML, CSS, or JavaScript to the browser, it compresses the files using the Gzip algorithm. The browser receives the compressed version, decompresses it instantly (this takes milliseconds), and renders the page. The result? 60-80% smaller files with virtually no downside.

Without Compression:
  • HTML file: 85 KB
  • CSS file: 120 KB
  • JavaScript: 350 KB
  • Total: 555 KB
  • Load time (3G): 7.4 seconds
With Gzip Enabled:
  • HTML file: 22 KB (74% smaller)
  • CSS file: 28 KB (77% smaller)
  • JavaScript: 95 KB (73% smaller)
  • Total: 145 KB
  • Load time (3G): 2.0 seconds
📊 The Business Impact:

Amazon found that every 100ms of latency costs 1% in sales. Google discovered that increasing page load time from 0.4s to 0.9s decreased traffic by 20%. For a site doing $1M/month, enabling Gzip compression—which can cut several seconds off load times on slow connections—can plausibly add $50,000-$100,000 in annual revenue from improved conversions alone.

Bonus: Compressed files use less bandwidth. If you pay for bandwidth or have a data cap, Gzip can cut the transfer size of your text assets by 60-70% immediately.

What Gets Compressed:

Gzip works phenomenally for text-based formats because they contain repetitive patterns. Images (JPG, PNG) and videos are already compressed, so Gzip has minimal effect. Focus compression on:

  • HTML documents (70-85% compression)
  • CSS stylesheets (75-85% compression)
  • JavaScript files (70-80% compression)
  • JSON data and APIs (75-90% compression)
  • XML and SVG files (80-90% compression)
  • Web fonts (WOFF already compressed, but EOT/TTF benefit)
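A typical mod_deflate block covering these types; the <IfModule> wrapper is a safety hedge so the rules are simply skipped if the module is not loaded:
# Compress text-based responses when mod_deflate is available
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
    AddOutputFilterByType DEFLATE application/javascript application/json
    AddOutputFilterByType DEFLATE application/xml image/svg+xml
</IfModule>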
💾

Configure Browser Caching

Store static assets in visitor browsers for instant repeat visits

Why Caching Matters More Than You Think:

Every time someone visits your site, their browser downloads every asset—images, CSS, JavaScript, fonts. That's fine for the first visit. But if they visit a second page or return tomorrow, why download the same logo again? Browser caching tells the visitor's browser: "This file won't change for 30 days—save it locally and don't ask for it again." The browser stores these files on the user's device, making subsequent page loads near-instant.

🎯 First Visit vs. Return Visit:
First Visit (Nothing Cached):
  • 47 HTTP requests
  • 2.3 MB downloaded
  • 4.2 second load time
  • Server handles full traffic
Return Visit (Cached):
  • 8 HTTP requests (83% reduction)
  • 145 KB downloaded (94% less)
  • 0.6 second load time (86% faster)
  • Server load minimal
Optimal Caching Strategies by File Type:
Images (JPG, PNG, WebP, SVG): 1 year

Your logo isn't changing. Product photos are permanent. Cache aggressively for images—they're the largest assets and benefit most from caching.

CSS & JavaScript: 30 days

These change with design updates or feature releases. 30 days balances caching benefits with the need to push updates. Use versioning (style.css?v=1.2) to bust cache when you update files.

Web Fonts (WOFF, WOFF2, TTF): 1 year

Fonts never change once deployed. Cache them as long as possible. Font files are 50-200 KB each and identical across all pages.

HTML Pages: 0 seconds (no cache)

HTML contains your content, which updates frequently. Don't cache HTML—always fetch fresh versions so users see the latest content immediately.

⚠️ Cache Busting Strategy:

When you update your CSS or JavaScript, cached versions in user browsers won't update until the cache expires. Solution: version your assets. Instead of style.css, use style.css?v=2.0 or style.v2.css. Browsers treat this as a different file, forcing an immediate download. This lets you cache for 1 year while still pushing updates instantly.
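One way to express those lifetimes with mod_expires; treat the exact types and durations as a starting sketch to tune, not a definitive list:
# Cache lifetimes matching the strategy above (ignored if mod_expires is absent)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/webp "access plus 1 year"
    ExpiresByType image/svg+xml "access plus 1 year"
    ExpiresByType font/woff2 "access plus 1 year"
    ExpiresByType text/css "access plus 30 days"
    ExpiresByType application/javascript "access plus 30 days"
    ExpiresByType text/html "access plus 0 seconds"
</IfModule>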

🏷️

ETags: Smart Cache Validation

Verify if cached files are still current without re-downloading

How ETags Work:

ETags (Entity Tags) are unique identifiers for file versions. When a browser caches a file, it also stores its ETag. On subsequent requests, the browser asks: "I have version abc123—is that still current?" If the file hasn't changed, the server responds "304 Not Modified" without sending the file. If it changed, the server sends the new version with a new ETag. This reduces bandwidth even when caches expire.

The Request Flow:
1. Browser: "I have style.css with ETag abc123. Still valid?"
2. Server: "Yes, that's current. 304 Not Modified." (sends 500 bytes)
3. Browser: "Great, I'll use my cached version." (saves 45 KB download)
⚠️ Multi-Server Consideration:

If you run multiple web servers (load balanced setup), disable ETags. Each server generates different ETags for the same file, causing cache invalidation every time the load balancer switches servers. In this scenario, rely on Expires headers and Last-Modified dates instead.
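If you do need to disable ETags for a load-balanced setup, a minimal sketch looks like this (the Header line assumes mod_headers):
# Drop ETags so multi-server caching relies on Expires and Last-Modified instead
<IfModule mod_headers.c>
    Header unset ETag
</IfModule>
FileETag None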

Redirects and URL Rewriting That Preserve SEO

Redirects aren't just about sending users from point A to point B—they're about preserving search rankings, fixing broken links, and consolidating duplicate content. Google treats different URL variations as separate pages unless you tell it otherwise. This means http vs https, www vs non-www, and /page vs /page/ can create duplicate content issues that tank your rankings. .htaccess fixes this at the infrastructure level.

🔒

Force HTTPS Everywhere

Redirect all HTTP traffic to HTTPS automatically

Why HTTPS Isn't Optional Anymore:

Google has treated HTTPS as a ranking signal for years, so HTTP-only sites are at a search disadvantage. Chrome shows "Not Secure" warnings for HTTP pages. Users abandon checkout flows when they see security warnings. Beyond SEO, HTTPS encrypts data between the browser and server—protecting passwords, credit cards, and personal information from interception. If you have an SSL certificate installed but still serve HTTP, you're leaving rankings and security on the table.

❌ Without HTTPS Redirect:
  • Users type yoursite.com (HTTP)
  • Site loads over HTTP
  • Chrome shows "Not Secure"
  • Google sees duplicate content: http:// and https:// versions
  • Link equity splits between both versions
  • Rankings suffer from dilution
✓ With HTTPS Redirect:
  • Users type yoursite.com (HTTP)
  • Server redirects to https://yoursite.com
  • Takes 50-100ms (imperceptible)
  • Google indexes only the HTTPS version
  • All link equity consolidated
  • Rankings benefit from the consolidated signals
📊 Real SEO Impact Data:

A 2023 study of 1M+ websites found that sites using HTTPS ranked an average of 5.2 positions higher than identical HTTP sites. E-commerce sites saw conversion rate increases of 10-15% after forcing HTTPS because the security indicators build trust. The ranking boost alone makes this worth implementing—the security and trust benefits are bonuses.
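The redirect itself is short. This sketch assumes mod_rewrite and preserves whatever host and path were requested:
# Send any plain-HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]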

🌐

Canonicalize WWW vs Non-WWW

Choose one version and redirect the other permanently

The Duplicate Content Problem:

To browsers and search engines, example.com and www.example.com are completely different websites that happen to show identical content. This creates several problems: Google doesn't know which version to rank, inbound links split between both versions (diluting SEO value), and analytics track them separately (making traffic data unreliable).

⚠️ Real Penalty Example:

An online retailer had www.example.com for their main site, but developers accidentally configured example.com to also work. Over 18 months, 40% of their backlinks went to the non-www version while their marketing pointed to the www version. Result: Their rankings dropped 30 positions for key terms because Google saw two mediocre sites instead of one strong one.

After implementing 301 redirects from non-www to www, rankings recovered to previous levels within 6 weeks as Google consolidated the link equity.

Which Should You Choose?
Choose WWW If:
  • You use CDNs (better DNS control)
  • You have subdomains (cookie isolation)
  • Your brand already uses www
  • You want traditional web convention
Choose Non-WWW If:
  • You want shorter, cleaner URLs
  • Your branding emphasizes simplicity
  • You don't use subdomains extensively
  • You prefer modern web aesthetic

Honest truth: It doesn't matter which you choose. What matters is choosing one and redirecting the other. The penalty comes from not deciding, not from making the "wrong" choice.
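Either direction is a two-line rule. This hedged example uses example.com as a placeholder and redirects non-www to www; swap the pattern and target to go the other way:
# Send non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]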

↗️

301 vs 302: The Critical Difference

Use the wrong redirect type and lose your rankings

Understanding Redirect Types:
301
Permanent Redirect

Tells search engines: "This page moved permanently to a new location. Update your index to the new URL and transfer all ranking signals." Google passes approximately 90-99% of link equity through 301 redirects. This is what you use for site migrations, consolidating pages, retiring old content, and fixing URL structures.

Use 301 when: Changing domains, redesigning site structure, merging duplicate pages, permanently moving content, fixing old URLs.
302
Temporary Redirect

Tells search engines: "This page is temporarily at a different URL but will return to the original location soon. Keep indexing the original URL." Google does not transfer link equity because it expects the redirect to be removed. If you use 302 when you mean 301, your new pages won't rank because Google keeps crediting the old URLs.

Use 302 when: A/B testing URLs, seasonal promotions, temporary maintenance pages, testing new page layouts.
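In .htaccess, both types can be single mod_alias lines; the paths and targets below are placeholders:
# Permanent move: search engines update their index and pass link equity
Redirect 301 /old-page https://example.com/new-page
# Temporary move: the original URL stays indexed
Redirect 302 /spring-sale https://example.com/seasonal-landing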
💀 The 302 Mistake That Costs Rankings:

This is the #1 redirect mistake: Using 302 for permanent moves because someone didn't understand the difference. A SaaS company migrated from old-product.com to new-product.com using 302 redirects. Six months later, Google still indexed old-product.com (which now returned errors), and new-product.com had zero rankings despite being the actual site.

They lost 85% of organic traffic because Google kept trying to index the old domain. After switching to 301 redirects, it took 3 months for Google to transfer the equity. They lost 9 months of SEO momentum from one configuration error.

/

Trailing Slash Consistency

Standardize /page vs /page/ to avoid duplicate content

Another Subtle Duplicate Content Issue:

/about and /about/ are technically different URLs that serve identical content. Most modern frameworks handle this correctly, but if your server doesn't canonicalize trailing slashes, you risk duplicate content penalties. The best practice: choose one format (with or without slash) and redirect the other.

Recommendation:

Remove trailing slashes for cleaner, shorter URLs: /about instead of /about/. This matches how most popular sites work (Twitter, GitHub, Stack Overflow) and feels more modern. Exception: WordPress uses trailing slashes by default—stick with that if you're on WordPress.
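A hedged sketch of the slash-removal variant, assuming mod_rewrite; the directory check keeps real folders working, and you should skip this entirely on WordPress:
# Strip a trailing slash unless the request is for an actual directory
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]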

Common Scenarios: When and Why to Use .htaccess

Understanding when to use .htaccess is as important as knowing how. Here are the most common real-world scenarios where .htaccess configuration solves critical problems, complete with the business context behind each use case.

📝

WordPress Sites

Optimize security and performance for the world's most popular CMS

WordPress powers 43% of all websites, making it the #1 target for automated attacks. Default WordPress installations are functional but not optimized for security or performance. A properly configured .htaccess adds multiple layers of protection that complement WordPress's built-in security.

Critical WordPress .htaccess Rules:
  • Block access to wp-config.php (contains database credentials)
  • Protect wp-admin directory from unauthorized access
  • Disable XML-RPC if not needed (common attack vector)
  • Hide WordPress version numbers from headers
  • Block author enumeration (/?author=1 reveals usernames)
  • Enable Gzip compression (WordPress doesn't enable this by default)
  • Configure aggressive browser caching for uploaded media
Why This Matters:

Automated bots scan millions of WordPress sites daily looking for vulnerabilities. They specifically target wp-config.php, try brute force attacks on wp-admin, and exploit XML-RPC endpoints. These .htaccess rules block these attacks at the Apache level—before WordPress PHP code even executes—saving server resources and preventing common exploits.
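Two of the rules above as a minimal sketch (Apache 2.4 syntax; on 2.2 use Deny from all):
# Block direct requests for wp-config.php
<Files wp-config.php>
    Require all denied
</Files>
# Disable XML-RPC if nothing on your site depends on it
<Files xmlrpc.php>
    Require all denied
</Files>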

🚀

Domain Migration & Site Redesign

Preserve SEO rankings when moving domains or restructuring URLs

When you migrate domains (oldsite.com → newsite.com) or redesign your site with new URL structures, you risk losing years of accumulated search rankings. Every inbound link pointing to old URLs becomes a 404 error, and all that SEO equity vanishes. Properly configured redirects solve this.

The Redirect Strategy:

Step 1: Map old URLs to new URLs in a spreadsheet (old-site.com/products/widget → new-site.com/shop/widget)

Step 2: Implement 301 redirects for every old URL to its new equivalent

Step 3: Set up a catch-all redirect that sends any unmapped pages to the homepage (or the closest relevant page)

Step 4: Keep redirects in place for minimum 1 year (Google recommends indefinitely)
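Expressed with mod_rewrite, the mappings from Step 2 plus the catch-all from Step 3 look roughly like this; the domain and paths are placeholders, and the specific rules must come before the catch-all:
# Specific old-to-new mappings first; [L] stops processing at the first match
RewriteEngine On
RewriteRule ^products/widget$ https://new-site.com/shop/widget [R=301,L]
RewriteRule ^about-us$ https://new-site.com/company [R=301,L]
# Anything not mapped above falls through to the homepage
RewriteRule ^(.*)$ https://new-site.com/ [R=301,L]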

Real Migration Disaster:

A B2B company migrated to a new domain without setting up redirects, assuming Google would figure it out. Result: 92% traffic drop within 3 weeks. By the time they realized and implemented redirects 2 months later, they'd lost most rankings permanently because Google treated the old domain as abandoned and the new domain as a brand new site with zero authority. Estimated revenue loss: $800K before partial recovery.

📄

Static HTML Sites & JAMstack

Add server-level features to static sites without backend code

Static sites (HTML/CSS/JS with no server-side processing) are incredibly fast and secure, but they lack dynamic server features. .htaccess bridges this gap by adding security, caching, and routing at the Apache level. This is especially valuable for Hugo, Jekyll, Gatsby, and Next.js static exports.

Essential Rules for Static Sites:
  • Aggressive caching: Cache everything for 1 year (content doesn't change)
  • Clean URLs: Serve /about.html when users request /about
  • Custom error pages: Display branded 404 pages instead of Apache defaults
  • Remove file extensions: /blog/post instead of /blog/post.html
  • Force HTTPS: Even static sites need encryption
  • Compress everything: Gzip all text assets
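Two of these rules sketched out, assuming mod_rewrite and a 404.html in the web root:
# Serve /about.html when the visitor requests /about
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
# Branded error page instead of the Apache default
ErrorDocument 404 /404.html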
Performance Impact:

With proper .htaccess configuration, static sites achieve Google PageSpeed scores of 95-100 consistently. The combination of aggressive caching, Gzip compression, and no server-side processing makes for sub-1-second load times globally. This translates to better rankings, higher conversion rates, and lower bounce rates—static sites with optimized .htaccess often outperform dynamic CMS sites by 3-5x on speed metrics alone.

🎯

Laravel & Modern PHP Frameworks

Enable clean routing and protect framework files

Laravel, Symfony, and CodeIgniter rely heavily on .htaccess for routing all requests through index.php (the front controller pattern). Without proper .htaccess configuration, these frameworks simply don't work—you get 404 errors for every route except the homepage.

How Framework Routing Works:

User requests: /products/123

Apache checks: Does a real file or directory named /products/123 exist? No.

.htaccess says: "If the file doesn't exist, send the request to /index.php"

Laravel receives: /products/123 and routes it to ProductController

Result: Clean URLs without .php extensions or query strings
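The essence of that front-controller rule, shown as a simplified sketch (Laravel's stock public/.htaccess adds a few extra conditions):
# Hand anything that is not a real file or directory to index.php
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>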

Critical Laravel .htaccess Rules:
  • Route all requests through public/index.php
  • Protect /storage directory from direct access
  • Hide .env file (critical—contains all secrets)
  • Block access to /vendor directory
  • Prevent access to composer.json and composer.lock
  • Enable Gzip compression for assets
Common Laravel Mistake:

Many developers deploy Laravel to the web root instead of pointing Apache to the /public directory. This exposes the entire application structure, including .env with database credentials, /storage with uploaded files, and /vendor with dependencies. Attackers scan for this mistake specifically. Proper .htaccess configuration or correct document root setup prevents this exposure.

🔌

REST API Protection

Rate limiting, CORS, and authentication for API endpoints

If you're serving an API from Apache, .htaccess can implement critical security measures: rate limiting to prevent abuse, CORS headers for browser access, IP whitelisting for internal APIs, and authentication challenges. These operate at the web server level, protecting your application before requests reach your API code.

Essential API .htaccess Rules:
  • Rate Limiting: Prevent abuse by limiting requests per IP (needs a module such as mod_evasive or ModSecurity; mod_ratelimit only throttles bandwidth per connection)
  • CORS Headers: Allow cross-origin requests from specified domains
  • IP Whitelisting: Restrict internal APIs to company IP ranges
  • HTTP Authentication: Require username/password for API access
  • Block Non-API Methods: Only allow GET, POST, PUT, and DELETE (plus OPTIONS if browsers need CORS preflight); block TRACE and anything else
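A hedged sketch of the CORS and method rules; app.example.com is a placeholder origin, mod_headers and mod_rewrite are assumed, and OPTIONS is left open for preflight requests:
# Allow one trusted origin to call the API from the browser
<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "https://app.example.com"
    Header set Access-Control-Allow-Methods "GET, POST, PUT, DELETE, OPTIONS"
</IfModule>
# Refuse methods the API never uses
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)$ [NC]
RewriteRule .* - [F]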
Performance Benefit:

Blocking malicious requests at the Apache level (before PHP executes) saves significant server resources. If your API receives 10,000 bot requests per day, handling them in .htaccess instead of your application code saves CPU cycles and reduces response times for legitimate users.

Testing Your .htaccess File: Don't Break Your Site

A misconfigured .htaccess file can take your entire site offline with a 500 Internal Server Error. One typo, one incorrect directive, one syntax error—and Apache refuses to serve any pages. This section shows you how to test safely, identify problems quickly, and recover from mistakes without panic.

🧪

Safe Testing Strategy

Test in development, validate syntax, deploy incrementally

The Professional Testing Workflow:
1
Backup Your Current .htaccess

Before making any changes: cp .htaccess .htaccess.backup
If something breaks, you can instantly restore: cp .htaccess.backup .htaccess

Pro tip: Keep dated backups: .htaccess.2024-03-15 so you can revert to specific versions.
2
Test in Development First

Never deploy .htaccess changes directly to production. Test on a staging server, local development environment, or subdomain first. Verify every rule works as expected before touching your live site.

Don't have a staging server? Use a subdomain: test.yoursite.com or dev.yoursite.com
3
Deploy One Rule at a Time

Don't add 15 rules simultaneously. Add one, test it, add another, test that. If something breaks, you know exactly which rule caused it. This takes 10 extra minutes but saves hours of debugging.

4
Check Apache Error Logs Immediately

After deploying, check your error logs: tail -f /var/log/apache2/error.log
Syntax errors appear immediately in the logs with specific line numbers and explanations.

5
Test Multiple Scenarios

Test edge cases: What happens with trailing slashes? What about uppercase URLs? Does HTTPS redirect work for all pages? Do redirects preserve query strings? Test for 5 minutes to catch problems before users do.

⚠️

Common .htaccess Mistakes and Fixes

Learn from others' mistakes—here's what goes wrong most often

500 Internal Server Error

Symptom: Your entire site shows "500 Internal Server Error" immediately after uploading .htaccess.

Common Causes:
  • Syntax error (missing >, extra bracket, typo in directive name)
  • Using a module that's not enabled (mod_rewrite, mod_deflate, mod_expires)
  • Incorrect RewriteCond pattern syntax
  • Conflicting directives
Quick Fix:

Rename .htaccess to .htaccess.broken (site comes back online immediately). Check Apache error logs for specific error. Fix the identified line. Test locally, then restore.
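For the module-not-enabled cause specifically, wrapping module-specific rules in <IfModule> makes them degrade silently instead of triggering a 500. For example:
# Directives inside the wrapper are ignored when mod_deflate is not loaded
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>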

🔄
Redirect Loop (ERR_TOO_MANY_REDIRECTS)

Symptom: Browser shows "This page isn't working - too many redirects" or similar error.

What Happened:

Your redirect rules create an infinite loop: Page A redirects to Page B, which redirects back to Page A. Or more commonly, your HTTPS redirect keeps triggering even after redirecting to HTTPS.

Example bad rule:
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
This redirects every request to HTTPS, including requests that are already HTTPS, creating an infinite loop.
Correct rule:
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
This only redirects if HTTPS is off, preventing the loop.
🔗
Redirects Not Working

Symptom: You added redirect rules but they don't execute—pages still load normally or show 404.

Troubleshooting Steps:
  1. Check if mod_rewrite is enabled: Most redirects require this Apache module. Contact your host if unsure.
  2. Add RewriteEngine On: Without this directive, all RewriteRule directives are ignored.
  3. Check rule order: Rules execute top-to-bottom. A catch-all rule at the top might prevent specific rules below from running.
  4. Clear browser cache: Browsers cache 301 redirects aggressively. Test in incognito mode.
  5. Check for conflicting rules: An earlier rule with [L] flag stops processing, preventing later rules from executing.
🐌
Site Suddenly Slow After .htaccess Changes

Symptom: Site loads fine but noticeably slower after updating .htaccess.

Likely Causes:
  • Complex regex in RewriteRules: Inefficient patterns can add 50-200ms per request
  • Too many RewriteCond checks: Each condition adds processing time
  • DNS lookups in rules: Hostname-based conditions and access rules force a DNS lookup on every request
  • File existence checks: RewriteCond %{REQUEST_FILENAME} !-f on every request adds disk I/O

Solution: Simplify rules, use specific patterns instead of catch-alls, cache heavily to reduce .htaccess processing frequency.

Production .htaccess Best Practices

Professional guidelines for maintaining .htaccess files long-term

Comment Everything

Six months from now, you won't remember why you added a specific rule. Add comments explaining what each section does and why it exists. Future you (and your team) will thank you.

Version Control Your .htaccess

Keep .htaccess in Git alongside your code. Track changes, revert mistakes, and see the history of modifications. Treat infrastructure as code.

Keep It Organized

Group related rules together with clear section headers: Security Rules, Performance Rules, Redirect Rules. Logical organization makes maintenance easier.

Document Hosting Provider Limitations

Some shared hosts disable certain directives (Options, AllowOverride restrictions). Document what's allowed in comments so you don't waste time debugging provider limitations.

Regular Audits

Every 6 months, review your .htaccess. Remove obsolete redirects, update outdated rules, and optimize for new Apache versions. Accumulated cruft slows down processing.

Monitor Performance Impact

Use server logs and APM tools to track .htaccess processing time. If it's adding significant latency, move rules to the main server config (much faster) or simplify complex patterns.

Ready to Generate Your Optimized .htaccess File?

Use our visual generator to create production-ready .htaccess files in seconds. Select the rules you need, configure options, and download a fully commented, tested configuration.