Using .htaccess for SEO – Redirects, URLs, Content Scanning


Of course, it is not surprising that one of the most important tasks of search engine optimization (SEO) is making your website search engine friendly. Competition in search results is fierce, so no advantage should be ignored.

The .htaccess file is one of the important files that let you control how your website functions and how it interacts with servers and browsers. It should be customized not only for security and access control, but also to get the most out of search engine optimization. When configuring .htaccess for SEO, there are features you will need to enable or disable depending on your requirements.

An .htaccess file customized for better search engine optimization touches nearly every aspect of technical SEO. We will also look at .htaccess snippets that extend the file's functionality and discuss how regular expressions are used in .htaccess rules. First of all, let's get to know this file: what is .htaccess?

What is Htaccess?

Beginners in particular may need a better understanding of what .htaccess is. It looks a little different from the other files in your directory: an extension without a name? In fact, .htaccess is a complete filename, not a file extension.

Technically, .htaccess is used by Apache-powered web servers for per-directory configuration. Also, if your hosting company does not let you edit the default .htaccess file, you can easily create a new one of your own and override the parameters set in the default file.

Directives can override certain properties at the level of a directory and its subdirectories, so uploading an .htaccess file to your web server's root folder means its directives apply to your entire website.

The file is easy to access: just look for it in your site's root directory. Most popular content management systems offer access to the file from their own management panels, and you can also reach it via FTP.


Directives give you precise control over the directories you want to configure. They are the commands written in the configuration file, and with them you can password-protect files, control directory browsing, and allow or ban IP addresses.
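As a small sketch of what directives look like in practice, the fragment below disables directory listings and password-protects a folder. The `.htpasswd` path and the realm name are placeholders; use your own values.

```apache
# Disable directory listings so visitors cannot browse folder contents
Options -Indexes

# Password-protect this directory with HTTP Basic auth
# (the AuthUserFile path below is a placeholder)
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/user/.htpasswd
Require valid-user
```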

What does Htaccess SEO do?

Search engine crawlers care about how you control your website's behavior through .htaccess, and a correct configuration signals reliability, which can work in your favor.

With .htaccess, you can create the clean URLs that search engines love, resolve 404 and HTTPS errors, and manage your 301 redirects. You can also block abusive bots by IP address or domain.
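A minimal sketch of those three tasks — a 301 redirect, a custom 404 page, and a bot block — might look like the following; the paths and the `BadBot` user-agent string are placeholders:

```apache
# Permanently redirect an old URL to its new location
Redirect 301 /old-page.html /new-page/

# Serve a custom page for 404 errors instead of the server default
ErrorDocument 404 /404.html

# Refuse requests from a bot identified by its User-Agent string
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F,L]
```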

Editing .htaccess is not difficult, and it improves the functionality and features of your site, but an incorrect configuration can take your site down: you may start receiving 500 Internal Server errors. To avoid such situations, back up your .htaccess file before making any changes. That way the original file is safe and you can experiment freely.

Where is the .htaccess file?

Depending on which platform you are using, the .htaccess file is usually located in the root of your directory. For example, if you are using WordPress, you will find it in the main directory of the WordPress installation. Because the filename starts with a dot, you may need to turn on the "Show Hidden Files" feature to see it.

SEO Friendly URLs

URLs play an important role in search engine rankings. According to Google's Matt Cutts, URL structure matters for ranking, and keywords in URLs add value as a ranking factor. Another essential feature of search engine friendly URLs is that they are as short as possible. URLs that are both short and predictable about where they will lead when clicked signal to search engines that you care about your users.

General URL Optimization

Let's start with a simple example: you can change the URL structure on your website with the following statement.

RewriteEngine On
RewriteRule ^test/([a-zA-Z0-9]+)$ index.php?topic=$1

# With the rule above, a clean URL such as /test/example
# is rewritten internally to index.php?topic=example

Removing file extensions from the URL (.php, .html)

With .htaccess, you can also get rid of page extensions like .html and .php, which add no value for the user. To remove them, simply copy and paste the code below into the .htaccess file.

RewriteEngine On

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php
# Removes the .php extension

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html
# Removes the .html extension

Canonical robots.txt

You can improve robots.txt crawling with canonical routing. The robots.txt file is normally located in the root directory, but bad bots and other malicious scripts can waste your resources by scanning your entire website looking for it.

You can use .htaccess to make sure bots and other crawlers always find the robots.txt file in one place, and to stop the continuous requests for it elsewhere. By doing this, you reduce the load on the server and simplify the job of important crawlers like Googlebot.

Robots.txt canonical routing

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$ [NC]
RewriteCond %{REQUEST_URI} robots\.txt [NC]
RewriteRule .* https://website/robots.txt [R=301,L]
</IfModule>

In the code above, all you have to do is change "website" to your site's URL; it must point to the publicly accessible root of your website. The method above uses Apache's rewrite module, but if you are looking for an alternative solution, you can use mod_alias instead.

RedirectMatch 301 ^/(.*)/robots\.txt$ https://website/robots.txt

URL www / http version redirect

One of the main problems is the www / non-www redirect. With this redirect in place, you won't have to deal with URL duplication in Google Search Console, and you can reinforce the fix with a simple canonical tag.
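As a sketch, the canonical URL can also be announced directly from .htaccess as an HTTP `Link` header, which search engines accept in place of an HTML canonical tag. The URL below is a placeholder for your preferred version of the page:

```apache
<IfModule mod_headers.c>
# Announce the canonical URL in a Link response header
# (replace the placeholder URL with your canonical address)
Header set Link "<https://www.site.com/>; rel=\"canonical\""
</IfModule>
```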

JavaScript / jQuery

// A rough client-side equivalent of an HTTP redirect,
// as if the user had clicked a link (the URL is a placeholder)
window.location.href = "https://www.site.com/";


RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.site\.com$ [NC]
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]

When forcing www or SSL, you can try possible variations like the following in the .htaccess file:

RewriteEngine On
RewriteBase /

Force WWW

RewriteCond %{HTTP_HOST} ^site\.com [NC]
RewriteRule (.*) https://www.site.com/$1 [R=301,L]

Force SSL

RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.site.com/$1 [R=301,L]
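Applied separately, the rules above can redirect a visitor twice (once for www, once for HTTPS). As a sketch, a single rule can cover both cases in one 301; replace `site.com` with your domain:

```apache
RewriteEngine On
# Redirect if the connection is not HTTPS, OR the host lacks www
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.site.com/$1 [R=301,L]
```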

IP Blocking

# The addresses below are examples; replace them with the IPs to block
Order Deny,Allow
Deny from 192.0.2.1
Deny from 203.0.113.0/24

Apache 2.4 Updated Version

# Order/Deny/Allow are deprecated in Apache 2.4; use Require instead
# (the addresses below are examples)
<RequireAll>
    Require all granted
    Require not ip 192.0.2.1
    Require not ip 203.0.113.0/24
</RequireAll>

PHP Method

$protocol = (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] === 'on') ? 'https://' : 'http://';

if (substr($_SERVER['HTTP_HOST'], 0, 4) !== 'www.') {
    // REQUEST_URI already starts with a slash
    header('Location: ' . $protocol . 'www.' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    exit;
}

Ajax Forwarding

$.ajax({
    type: "POST",
    url: reqUrl,        // request URL, defined elsewhere
    data: reqBody,      // request payload, defined elsewhere
    dataType: "json",
    success: function(data, textStatus) {
        if (data.redirect) {
            // the server asked for a redirect
            window.location.href = data.redirect;
        } else {
            // otherwise handle the normal response here
        }
    }
});

Redirect to FeedBurner

Redirecting your feeds to FeedBurner can help boost your SEO, and with .htaccess you can automate the whole process. We will use the mod_rewrite module to achieve the desired result.

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/feed/ [NC]
RewriteCond %{HTTP_USER_AGENT} !(FeedBurner|FeedValidator) [NC]
RewriteRule .* http://feeds.feedburner.com/mainContentFeed [L,R=302]

RewriteCond %{REQUEST_URI} ^/comments/feed/ [NC]
RewriteCond %{HTTP_USER_AGENT} !(FeedBurner|FeedValidator) [NC]
RewriteRule .* http://feeds.feedburner.com/allCommentsFeed [L,R=302]
</IfModule>

To customize the code above for your website, replace the "mainContentFeed" and "allCommentsFeed" values with the names of your own FeedBurner feeds.