.htaccess File Generator
Create secure, optimized Apache .htaccess files in seconds. Configure redirects, security rules, caching, and performance settings with our intuitive visual builder. No coding expertise required.
Security
Block threats and protect files
Performance
Caching and compression rules
Redirects
301, 302, and URL rewriting
Templates
WordPress, Laravel, and more
New Projects
Start with optimized Apache configuration from day one
Security Hardening
Add layers of protection against common attacks
Site Migration
Handle domain changes and maintain SEO rankings
Compatible with Apache 2.2+ • Works with shared hosting, VPS, and dedicated servers
Master Apache .htaccess Configuration: Security, Performance, and SEO
Generate production-ready .htaccess files in seconds. Implement server-level security, optimize caching strategies, configure redirects correctly, and boost site performance without touching code. Essential for WordPress sites, Laravel applications, static hosting, and any Apache web server configuration.
Why .htaccess Files Are Your Secret Weapon for Website Control
The .htaccess file is Apache's directory-level configuration powerhouse that lets you control server behavior without touching the main server config. Think of it as your website's security guard, traffic director, and performance optimizer rolled into one text file. Placed in your web root directory, it executes before your application code runs—meaning you can block threats, redirect users, enable caching, and rewrite URLs at the server level. This happens in milliseconds before PHP even loads, making it incredibly efficient.
💡 Real-World Impact: The Forgotten .htaccess Mistake
A SaaS company launched their new marketing site on shared hosting. Three months later, they discovered their /api/ directory was publicly browsable—exposing internal endpoint documentation, test credentials, and database structure to anyone who typed in the URL. A competitor found it, copied their API design, and launched a similar product six months earlier than expected.
The fix: a single line in .htaccess, Options -Indexes, would have prevented this. The cost of not having it? An estimated $2M in lost first-mover advantage. This is why even basic .htaccess security isn't optional: it's fundamental infrastructure hygiene that takes 30 seconds to implement.
Essential Security Rules Every Website Needs
Security through .htaccess isn't about building an impenetrable fortress—it's about making your site harder to attack than the next one. Attackers scan millions of sites looking for low-hanging fruit. These rules eliminate the obvious vulnerabilities that get exploited in automated attacks. Each one takes seconds to implement but blocks entire categories of threats.
Disable Directory Browsing
Prevent visitors from seeing your folder structure and file lists
By default, Apache shows a file listing if no index.html or index.php exists in a directory. This seems harmless until attackers browse /uploads/ and find admin documents, /backups/ with database dumps, or /includes/ revealing your site structure. Directory browsing is reconnaissance handed to attackers on a silver platter.
Options -Indexes

Automated scanners check /wp-content/uploads/2024/, /assets/docs/, /temp/, and hundreds of other common paths. If they find directory listings enabled, they download everything and search for credentials, API keys, or personal data. Researchers investigating the Panama Papers leak pointed to a poorly secured WordPress installation as a likely entry point; directory browsing hands attackers exactly that kind of reconnaissance for free.
Protect Sensitive Configuration Files
Block access to .env, .git, and other system files
Your .env file contains database credentials, API keys, and secret tokens. Your .git directory holds your entire source code history. If these are accessible via browser, attackers download them in seconds. This isn't theoretical—automated tools like GitDumper specifically hunt for exposed .git directories on millions of sites daily.
- .env – Database passwords, API keys, app secrets
- .git/ – Complete source code and commit history
- .htaccess – Your security configuration itself
- composer.json – Dependency versions (for vulnerability scanning)
- package.json – Node dependencies and versions
- wp-config.php – WordPress database credentials
In 2023 security research, 11% of websites had accessible .git directories, and 3% exposed .env files. That's millions of sites with their credentials publicly visible. The average time from exposure to exploitation? Less than 48 hours. Attackers run automated scanners continuously, and they find these files before you realize they're exposed.
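A minimal sketch of those protections, assuming Apache 2.4 (on Apache 2.2, replace Require all denied with Order allow,deny plus Deny from all):

# Deny web access to sensitive files
<FilesMatch "^(\.env|\.htaccess|composer\.(json|lock)|package\.json|wp-config\.php)$">
    Require all denied
</FilesMatch>

# Hide the entire .git directory, not just individual files
RedirectMatch 404 /\.git

Returning 404 for .git (rather than 403) also avoids confirming to scanners that the directory exists at all.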
Block SQL Injection Attempts in Query Strings
Filter malicious database queries before they reach your code
SQL injection attacks work by inserting malicious database commands into URL parameters or form inputs. While your application should sanitize inputs, .htaccess adds a first line of defense by checking query strings for common attack patterns and blocking the request before it reaches PHP. This catches automated attacks that spray injection attempts across thousands of sites.
- UNION SELECT statements in URLs
- Encoded SQL keywords: %55NION, %53ELECT
- JavaScript injection: <script> tags in parameters
- Base64 encoded attacks: base64_encode, base64_decode
- Database function calls: concat(), @@version
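A hedged mod_rewrite sketch that rejects the patterns above before PHP ever runs (these regexes are illustrative, not an exhaustive filter):

RewriteEngine On
# Return 403 Forbidden when the query string matches common injection patterns
RewriteCond %{QUERY_STRING} (union[^a-z]*select|%55NION|%53ELECT) [NC,OR]
RewriteCond %{QUERY_STRING} (<script>|base64_(en|de)code) [NC,OR]
RewriteCond %{QUERY_STRING} (concat\(|@@version) [NC]
RewriteRule .* - [F,L]

Treat this as a supplement to input sanitization in your application, never a replacement for it.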
Prevent Image Hotlinking (Bandwidth Theft)
Stop other sites from embedding your images and stealing bandwidth
Hotlinking happens when another website displays your images by linking directly to your server: <img src="https://yoursite.com/photo.jpg">. Every time someone visits their site, your server pays the bandwidth cost. Popular images can rack up thousands of dollars in hosting fees while driving zero traffic to your site. Image aggregators and content farms are notorious for this.
A photographer had a viral image hotlinked by 50+ websites. Over one month: 18TB bandwidth stolen, costing $270 in overage fees. Traffic to her own portfolio? Zero. After enabling hotlink protection, bandwidth dropped 85% immediately and hosting bills returned to normal.
.htaccess checks the HTTP Referer header to see which domain is requesting the image. If the referer is another domain, Apache blocks the request or serves a replacement image; most configurations allow empty referers so direct visits and bookmarks keep working. This happens before the full image loads, saving bandwidth immediately.
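A common sketch, assuming yoursite.com is your domain; adjust the extension list to match the assets you serve:

RewriteEngine On
# Allow empty referers so bookmarks, direct visits, and privacy tools still work
RewriteCond %{HTTP_REFERER} !^$
# Allow requests that originate from your own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.com/ [NC]
# Refuse (403) image requests from everyone else
RewriteRule \.(jpe?g|png|gif|webp)$ - [F,NC]

To serve a replacement image instead of a 403, swap the last line for a RewriteRule pointing at a small placeholder graphic with [R,L] flags (and exclude that graphic from the rule to avoid a loop).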
Performance Rules That Actually Make Sites Faster
Page speed isn't just about user experience—it's a direct Google ranking factor. Sites loading in under 2 seconds rank significantly higher than those taking 5+ seconds. The performance rules in .htaccess operate at the HTTP protocol level, which means they execute before your application code runs, making them incredibly efficient. These aren't micro-optimizations; they're fundamental infrastructure improvements that deliver measurable results.
Enable Gzip Compression
Compress text files by 60-80% before sending to browsers
Gzip compression works like a ZIP file for your website assets. Before Apache sends HTML, CSS, or JavaScript to the browser, it compresses the files using the Gzip algorithm. The browser receives the compressed version, decompresses it instantly (this takes milliseconds), and renders the page. The result? 60-80% smaller files with virtually no downside.
Without Gzip:
- HTML file: 85 KB
- CSS file: 120 KB
- JavaScript: 350 KB
- Total: 555 KB
- Load time (3G): 7.4 seconds

With Gzip:
- HTML file: 22 KB (74% smaller)
- CSS file: 28 KB (77% smaller)
- JavaScript: 95 KB (73% smaller)
- Total: 145 KB
- Load time (3G): 2.0 seconds
Amazon found that every 100ms of latency costs 1% in sales. Google discovered that increasing page load time from 0.4s to 0.9s decreased traffic by 20%. For a site doing $1M/month, enabling Gzip compression, which in the example above cuts 3G load time by more than 5 seconds, can add $50,000-$100,000 in annual revenue from improved conversions alone.
Bonus: compressed files use less bandwidth. If you pay for bandwidth or have a data cap, Gzip can cut your transfer volume for text assets, and the costs tied to it, by 60-70% immediately.
Gzip works phenomenally for text-based formats because they contain repetitive patterns. Images (JPG, PNG) and videos are already compressed, so Gzip has minimal effect. Focus compression on:
- HTML documents (70-85% compression)
- CSS stylesheets (75-85% compression)
- JavaScript files (70-80% compression)
- JSON data and APIs (75-90% compression)
- XML and SVG files (80-90% compression)
- Web fonts (WOFF already compressed, but EOT/TTF benefit)
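A typical mod_deflate block covering the formats above, wrapped in IfModule so it fails safe when the module isn't loaded:

<IfModule mod_deflate.c>
    # Compress text-based responses; images and video are skipped on purpose
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
    AddOutputFilterByType DEFLATE application/javascript application/json
    AddOutputFilterByType DEFLATE application/xml image/svg+xml
</IfModule>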
Configure Browser Caching
Store static assets in visitor browsers for instant repeat visits
Every time someone visits your site, their browser downloads every asset—images, CSS, JavaScript, fonts. That's fine for the first visit. But if they visit a second page or return tomorrow, why download the same logo again? Browser caching tells the visitor's browser: "This file won't change for 30 days—save it locally and don't ask for it again." The browser stores these files on the user's device, making subsequent page loads near-instant.
Without caching (every visit):
- 47 HTTP requests
- 2.3 MB downloaded
- 4.2 second load time
- Server handles full traffic

With caching (repeat visits):
- 8 HTTP requests (83% reduction)
- 145 KB downloaded (94% less)
- 0.6 second load time (86% faster)
- Minimal server load
Images (cache aggressively, up to 1 year): Your logo isn't changing. Product photos are permanent. Cache aggressively for images; they're the largest assets and benefit most from caching.

CSS and JavaScript (cache for 30 days): These change with design updates or feature releases. 30 days balances caching benefits with the need to push updates. Use versioning (style.css?v=1.2) to bust the cache when you update files.

Web fonts (cache up to 1 year): Fonts never change once deployed. Cache them as long as possible. Font files are 50-200 KB each and identical across all pages.

HTML (don't cache): HTML contains your content, which updates frequently. Don't cache HTML; always fetch fresh versions so users see the latest content immediately.
When you update your CSS or JavaScript, cached versions in user browsers won't update until the cache expires. Solution: version your assets. Instead of style.css, use style.css?v=2.0 or style.v2.css. Browsers treat this as a different file, forcing an immediate download. This lets you cache for 1 year while still pushing updates instantly.
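A hedged mod_expires sketch implementing those durations:

<IfModule mod_expires.c>
    ExpiresActive On
    # Images: effectively permanent
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/webp "access plus 1 year"
    # CSS and JavaScript: 30 days, busted with ?v= versioning
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
    # Fonts: as long as possible
    ExpiresByType font/woff2 "access plus 1 year"
    # HTML: always fetch fresh
    ExpiresByType text/html "access plus 0 seconds"
</IfModule>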
ETags: Smart Cache Validation
Verify if cached files are still current without re-downloading
ETags (Entity Tags) are unique identifiers for file versions. When a browser caches a file, it also stores its ETag. On subsequent requests, the browser asks: "I have version abc123—is that still current?" If the file hasn't changed, the server responds "304 Not Modified" without sending the file. If it changed, the server sends the new version with a new ETag. This reduces bandwidth even when caches expire.
If you run multiple web servers (load balanced setup), disable ETags. Each server generates different ETags for the same file, causing cache invalidation every time the load balancer switches servers. In this scenario, rely on Expires headers and Last-Modified dates instead.
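A minimal sketch for the load-balanced case (the Header directive needs mod_headers):

# Stop Apache from generating ETags
FileETag None
<IfModule mod_headers.c>
    # Strip any ETag a backend or module still sets
    Header unset ETag
</IfModule>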
Redirects and URL Rewriting That Preserve SEO
Redirects aren't just about sending users from point A to point B—they're about preserving search rankings, fixing broken links, and consolidating duplicate content. Google treats different URL variations as separate pages unless you tell it otherwise. This means http vs https, www vs non-www, and /page vs /page/ can create duplicate content issues that tank your rankings. .htaccess fixes this at the infrastructure level.
Force HTTPS Everywhere
Redirect all HTTP traffic to HTTPS automatically
As of 2024, Google penalizes non-HTTPS sites in search rankings. Chrome shows "Not Secure" warnings for HTTP pages. Users abandon checkout flows when they see security warnings. Beyond SEO, HTTPS encrypts data between the browser and server—protecting passwords, credit cards, and personal information from interception. If you have an SSL certificate installed but still serve HTTP, you're leaving rankings and security on the table.
Without forcing HTTPS:
- Users type yoursite.com (HTTP)
- Site loads over HTTP
- Chrome shows "Not Secure"
- Google sees duplicate content: http:// and https:// versions
- Link equity splits between both versions
- Rankings suffer from dilution

With forced HTTPS:
- Users type yoursite.com (HTTP)
- Server redirects to https://yoursite.com
- Redirect takes 50-100ms (imperceptible)
- Google indexes only the HTTPS version
- All link equity is consolidated
- Rankings improve from the consolidated signals
A 2023 study of 1M+ websites found that sites using HTTPS ranked an average of 5.2 positions higher than identical HTTP sites. E-commerce sites saw conversion rate increases of 10-15% after forcing HTTPS because the security indicators build trust. The ranking boost alone makes this worth implementing—the security and trust benefits are bonuses.
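The standard pattern is short. This is a sketch; hosts behind a proxy or CDN may need to test X-Forwarded-Proto instead of %{HTTPS}:

RewriteEngine On
# Only redirect when the current request arrived over plain HTTP
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]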
Canonicalize WWW vs Non-WWW
Choose one version and redirect the other permanently
To browsers and search engines, example.com and www.example.com are completely different websites that happen to show identical content. This creates several problems: Google doesn't know which version to rank, inbound links split between both versions (diluting SEO value), and analytics track them separately (making traffic data unreliable).
An online retailer had www.example.com for their main site, but developers accidentally configured example.com to also work. Over 18 months, 40% of their backlinks went to the non-www version while their marketing pointed to the www version. Result: Their rankings dropped 30 positions for key terms because Google saw two mediocre sites instead of one strong one.
After implementing 301 redirects from non-www to www, rankings recovered to previous levels within 6 weeks as Google consolidated the link equity.
Choose www when:
- You use CDNs (better DNS control)
- You have subdomains (cookie isolation)
- Your brand already uses www
- You want the traditional web convention

Choose non-www when:
- You want shorter, cleaner URLs
- Your branding emphasizes simplicity
- You don't use subdomains extensively
- You prefer a modern web aesthetic
Honest truth: It doesn't matter which you choose. What matters is choosing one and redirecting the other. The penalty comes from not deciding, not from making the "wrong" choice.
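A hedged sketch that redirects non-www to www, assuming example.com; flip the condition and target to go the other way:

RewriteEngine On
# Match only the bare domain
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]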
301 vs 302: The Critical Difference
Use the wrong redirect type and lose your rankings
Tells search engines: "This page moved permanently to a new location. Update your index to the new URL and transfer all ranking signals." Google passes approximately 90-99% of link equity through 301 redirects. This is what you use for site migrations, consolidating pages, retiring old content, and fixing URL structures.
Tells search engines: "This page is temporarily at a different URL but will return to the original location soon. Keep indexing the original URL." Google does not transfer link equity because it expects the redirect to be removed. If you use 302 when you mean 301, your new pages won't rank because Google keeps crediting the old URLs.
This is the #1 redirect mistake: Using 302 for permanent moves because someone didn't understand the difference. A SaaS company migrated from old-product.com to new-product.com using 302 redirects. Six months later, Google still indexed old-product.com (which now returned errors), and new-product.com had zero rankings despite being the actual site.
They lost 85% of organic traffic because Google kept trying to index the old domain. After switching to 301 redirects, it took 3 months for Google to transfer the equity. They lost 9 months of SEO momentum from one configuration error.
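Both types in mod_alias syntax, with hypothetical paths:

# Permanent move: link equity transfers to the new URL
Redirect 301 /old-page https://example.com/new-page
# Temporary move: the original URL stays indexed
Redirect 302 /shop https://example.com/holiday-sale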
Trailing Slash Consistency
Standardize /page vs /page/ to avoid duplicate content
/about and /about/ are technically different URLs that serve identical content. Most modern frameworks handle this correctly, but if your server doesn't canonicalize trailing slashes, you risk duplicate content penalties. The best practice: choose one format (with or without slash) and redirect the other.
Remove trailing slashes for cleaner, shorter URLs: /about instead of /about/. This matches how most popular sites work (Twitter, GitHub, Stack Overflow) and feels more modern. Exception: WordPress uses trailing slashes by default—stick with that if you're on WordPress.
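A common sketch that strips trailing slashes from non-directory URLs (skip this on WordPress, which expects them):

RewriteEngine On
# Leave real directories alone
RewriteCond %{REQUEST_FILENAME} !-d
# Redirect /about/ to /about
RewriteRule ^(.*)/$ /$1 [R=301,L]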
Common Scenarios: When and Why to Use .htaccess
Understanding when to use .htaccess is as important as knowing how. Here are the most common real-world scenarios where .htaccess configuration solves critical problems, complete with the business context behind each use case.
WordPress Sites
Optimize security and performance for the world's most popular CMS
WordPress powers 43% of all websites, making it the #1 target for automated attacks. Default WordPress installations are functional but not optimized for security or performance. A properly configured .htaccess adds multiple layers of protection that complement WordPress's built-in security.
- Block access to wp-config.php (contains database credentials)
- Protect wp-admin directory from unauthorized access
- Disable XML-RPC if not needed (common attack vector)
- Hide WordPress version numbers from headers
- Block author enumeration (/?author=1 reveals usernames)
- Enable Gzip compression (WordPress doesn't enable this by default)
- Configure aggressive browser caching for uploaded media
Automated bots scan millions of WordPress sites daily looking for vulnerabilities. They specifically target wp-config.php, try brute force attacks on wp-admin, and exploit XML-RPC endpoints. These .htaccess rules block these attacks at the Apache level—before WordPress PHP code even executes—saving server resources and preventing common exploits.
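A hedged sketch of two of those rules in Apache 2.4 syntax (on 2.2, use Order allow,deny plus Deny from all):

# Block direct access to wp-config.php
<Files wp-config.php>
    Require all denied
</Files>

# Disable XML-RPC; remove this block if a plugin such as Jetpack needs it
<Files xmlrpc.php>
    Require all denied
</Files>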
Domain Migration & Site Redesign
Preserve SEO rankings when moving domains or restructuring URLs
When you migrate domains (oldsite.com → newsite.com) or redesign your site with new URL structures, you risk losing years of accumulated search rankings. Every inbound link pointing to old URLs becomes a 404 error, and all that SEO equity vanishes. Properly configured redirects solve this.
Step 1: Map old URLs to new URLs in a spreadsheet (old-site.com/products/widget → new-site.com/shop/widget)
Step 2: Implement 301 redirects for every old URL to its new equivalent
Step 3: Set up a catch-all redirect for any unmapped pages to send to homepage
Step 4: Keep redirects in place for minimum 1 year (Google recommends indefinitely)
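A hedged sketch of steps 2 and 3, assuming new-site.com (mod_alias applies these in order, so the catch-all must come last):

# Step 2: one explicit 301 per mapped URL
Redirect 301 /products/widget https://new-site.com/shop/widget
# Step 3: anything unmapped goes to the new homepage
RedirectMatch 301 ^/(.*)$ https://new-site.com/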
A B2B company migrated to a new domain without setting up redirects, assuming Google would figure it out. Result: 92% traffic drop within 3 weeks. By the time they realized and implemented redirects 2 months later, they'd lost most rankings permanently because Google treated the old domain as abandoned and the new domain as a brand new site with zero authority. Estimated revenue loss: $800K before partial recovery.
Static HTML Sites & JAMstack
Add server-level features to static sites without backend code
Static sites (HTML/CSS/JS with no server-side processing) are incredibly fast and secure, but they lack dynamic server features. .htaccess bridges this gap by adding security, caching, and routing at the Apache level. This is especially valuable for Hugo, Jekyll, Gatsby, and Next.js static exports.
- Aggressive caching: Cache everything for 1 year (content doesn't change)
- Clean URLs: Serve /about.html when users request /about
- Custom error pages: Display branded 404 pages instead of Apache defaults
- Remove file extensions: /blog/post instead of /blog/post.html
- Force HTTPS: Even static sites need encryption
- Compress everything: Gzip all text assets
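A hedged sketch of the clean-URL and error-page rules from that list:

RewriteEngine On
# If /about isn't a directory but about.html exists, serve the .html file
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]

# Branded error page instead of the Apache default
ErrorDocument 404 /404.html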
With proper .htaccess configuration, static sites achieve Google PageSpeed scores of 95-100 consistently. The combination of aggressive caching, Gzip compression, and no server-side processing makes for sub-1-second load times globally. This translates to better rankings, higher conversion rates, and lower bounce rates—static sites with optimized .htaccess often outperform dynamic CMS sites by 3-5x on speed metrics alone.
Laravel & Modern PHP Frameworks
Enable clean routing and protect framework files
Laravel, Symfony, and CodeIgniter rely heavily on .htaccess for routing all requests through index.php (the front controller pattern). Without proper .htaccess configuration, these frameworks simply don't work—you get 404 errors for every route except the homepage.
User requests: /products/123
Apache checks: Does a real file or directory at /products/123 exist? No.
.htaccess says: "If the file doesn't exist, send the request to /index.php"
Laravel receives: /products/123 and routes it to ProductController
Result: Clean URLs without .php extensions or query strings
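The rewrite behind that flow is only a few lines; this sketch mirrors the front-controller pattern shipped in Laravel's public/.htaccess:

<IfModule mod_rewrite.c>
    RewriteEngine On
    # If the request isn't a real directory or file...
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    # ...hand it to the front controller
    RewriteRule ^ index.php [L]
</IfModule>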
- Route all requests through public/index.php
- Protect /storage directory from direct access
- Hide .env file (critical—contains all secrets)
- Block access to /vendor directory
- Prevent access to composer.json and composer.lock
- Enable Gzip compression for assets
Many developers deploy Laravel to the web root instead of pointing Apache to the /public directory. This exposes the entire application structure, including .env with database credentials, /storage with uploaded files, and /vendor with dependencies. Attackers scan for this mistake specifically. Proper .htaccess configuration or correct document root setup prevents this exposure.
REST API Protection
Rate limiting, CORS, and authentication for API endpoints
If you're serving an API from Apache, .htaccess can implement critical security measures: rate limiting to prevent abuse, CORS headers for browser access, IP whitelisting for internal APIs, and authentication challenges. These operate at the web server level, protecting your application before requests reach your API code.
- Rate Limiting: Prevent abuse by limiting requests per IP (requires mod_ratelimit)
- CORS Headers: Allow cross-origin requests from specified domains
- IP Whitelisting: Restrict internal APIs to company IP ranges
- HTTP Authentication: Require username/password for API access
- Block Unused Methods: Allow only the verbs your API uses (GET, POST, PUT, DELETE) and block TRACE and other unneeded methods; keep OPTIONS reachable if browsers call the API, since CORS preflight depends on it
Blocking malicious requests at the Apache level (before PHP executes) saves significant server resources. If your API receives 10,000 bot requests per day, handling them in .htaccess instead of your application code saves CPU cycles and reduces response times for legitimate users.
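A hedged sketch of the CORS and method rules, assuming app.example.com as the only allowed origin:

<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "https://app.example.com"
    Header set Access-Control-Allow-Methods "GET, POST, PUT, DELETE, OPTIONS"
</IfModule>

# Refuse methods the API never uses
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)$
RewriteRule .* - [F]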
Testing Your .htaccess File: Don't Break Your Site
A misconfigured .htaccess file can take your entire site offline with a 500 Internal Server Error. One typo, one incorrect directive, one syntax error—and Apache refuses to serve any pages. This section shows you how to test safely, identify problems quickly, and recover from mistakes without panic.
Safe Testing Strategy
Test in development, validate syntax, deploy incrementally
Before making any changes: cp .htaccess .htaccess.backup
If something breaks, you can instantly restore: cp .htaccess.backup .htaccess
Never deploy .htaccess changes directly to production. Test on a staging server, local development environment, or subdomain first. Verify every rule works as expected before touching your live site.
Don't add 15 rules simultaneously. Add one, test it, add another, test that. If something breaks, you know exactly which rule caused it. This takes 10 extra minutes but saves hours of debugging.
After deploying, check your error logs: tail -f /var/log/apache2/error.log
Syntax errors appear immediately in the logs with specific line numbers and explanations.
Test edge cases: What happens with trailing slashes? What about uppercase URLs? Does HTTPS redirect work for all pages? Do redirects preserve query strings? Test for 5 minutes to catch problems before users do.
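A few hypothetical spot checks with curl, assuming example.com:

# Expect a 301 with a Location header pointing at HTTPS
curl -I http://example.com/
# Confirm Gzip is actually being served
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://example.com/ | grep -i content-encoding
# Verify query strings survive a redirect
curl -I "http://example.com/old-page?ref=newsletter"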
Common .htaccess Mistakes and Fixes
Learn from others' mistakes—here's what goes wrong most often
Symptom: Your entire site shows "500 Internal Server Error" immediately after uploading .htaccess.
- Syntax error (missing >, extra bracket, typo in directive name)
- Using a module that's not enabled (mod_rewrite, mod_deflate, mod_expires)
- Incorrect RewriteCond pattern syntax
- Conflicting directives
Rename .htaccess to .htaccess.broken (the site comes back online immediately). Check the Apache error log for the specific error, fix the identified line, test locally, then restore.
Symptom: Browser shows "This page isn't working - too many redirects" or similar error.
Your redirect rules create an infinite loop: Page A redirects to Page B, which redirects back to Page A. Or more commonly, your HTTPS redirect keeps triggering even after redirecting to HTTPS.
The broken version:

RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

This redirects every request to HTTPS, including requests that are already on HTTPS, creating an infinite loop.

The fix:

RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

The condition only fires when HTTPS is off, which breaks the loop.
Symptom: You added redirect rules but they don't execute—pages still load normally or show 404.
- Check if mod_rewrite is enabled: Most redirects require this Apache module. Contact your host if unsure.
- Add RewriteEngine On: Without this directive, all RewriteRule directives are ignored.
- Check rule order: Rules execute top-to-bottom. A catch-all rule at the top might prevent specific rules below from running.
- Clear browser cache: Browsers cache 301 redirects aggressively. Test in incognito mode.
- Check for conflicting rules: An earlier rule with [L] flag stops processing, preventing later rules from executing.
Symptom: Site loads fine but noticeably slower after updating .htaccess.
- Complex regex in RewriteRules: Inefficient patterns can add 50-200ms per request
- Too many RewriteCond checks: Each condition adds processing time
- DNS lookups in rules: Never use external domains in rewrite conditions
- File existence checks: RewriteCond %{REQUEST_FILENAME} !-f adds disk I/O on every request
Solution: Simplify rules, use specific patterns instead of catch-alls, cache heavily to reduce .htaccess processing frequency.
Production .htaccess Best Practices
Professional guidelines for maintaining .htaccess files long-term
Six months from now, you won't remember why you added a specific rule. Add comments explaining what each section does and why it exists. Future you (and your team) will thank you.
Keep .htaccess in Git alongside your code. Track changes, revert mistakes, and see the history of modifications. Treat infrastructure as code.
Group related rules together with clear section headers: Security Rules, Performance Rules, Redirect Rules. Logical organization makes maintenance easier.
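A hypothetical skeleton showing the idea; the rules and notes are placeholders:

# ==================== SECURITY ====================
# Disable directory listings (added after the January audit)
Options -Indexes

# ==================== PERFORMANCE ====================
# Compression and caching rules live here

# ==================== REDIRECTS ====================
# 301s from the site restructure live here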
Some shared hosts disable certain directives (Options, AllowOverride restrictions). Document what's allowed in comments so you don't waste time debugging provider limitations.
Every 6 months, review your .htaccess. Remove obsolete redirects, update outdated rules, and optimize for new Apache versions. Accumulated cruft slows down processing.
Use server logs and APM tools to track .htaccess processing time. If it's adding significant latency, move rules to the main server config (much faster) or simplify complex patterns.
Ready to Generate Your Optimized .htaccess File?
Use our visual generator to create production-ready .htaccess files in seconds. Select the rules you need, configure options, and download a fully commented, tested configuration.