
Add a robots.txt file to your Webflow project

A tutorial to quickly add a robots.txt file to your Webflow website in order to optimize your organic search performance (SEO) and manage the indexing of your pages

Published on 22/9/2022 - Updated on 11/4/2023 - 5 min read

Webflow is a very comprehensive tool for optimizing your website's SEO. It gives you native access to several settings that let your website gain traffic without paying for ads!

Here is a list of all the SEO optimizations to implement in your Webflow project to increase your online visibility.

One of them is to add a robots.txt file! In today's article, we will show you how to set up this file for your site and how to configure it properly.

What is a robots.txt file?

A robots.txt file is a text file that tells search engines' indexing robots (crawlers) which pages they can and cannot crawl.

Using directives, you can tell the indexing robots not to crawl certain pages of your website.

Warning: with robots.txt you can block access for indexing robots, but not for human users.

Why add a robots.txt (for SEO)?

Organic search performance necessarily depends on your pages being indexed by search engines. To index a page of your website, search engines (such as Google, Bing, Yahoo, etc.) explore your content with their robots.

The point of robots.txt is to tell these robots that certain pages bring no added value to the site and that there is no need for the crawlers to come and scan them.

Even if it doesn't have a huge impact on your SEO, it will optimize your "crawl budget" (in short: the number of pages Google will crawl on your site). In turn, you potentially give your quality content more weight.

What does a robots.txt file look like?

Robots.txt files are made up of different directives. For each of them, you must specify which robots are affected (user-agent) and which pages should not be crawled (disallow). You can also explicitly allow pages to be crawled with "allow".

If you don't know which instructions to give, we recommend that you allow the crawling, and therefore the indexing, of all your pages with the following robots.txt file:

User-agent: *
Disallow:

Here the asterisk (*) means that the instruction applies to all robots without exception. We put nothing after Disallow because we want all our pages to remain accessible.
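Note that User-agent can also name one specific crawler instead of targeting all of them. Here is a minimal sketch (the /drafts/ folder is a hypothetical example) where the rule applies only to Google's main crawler, Googlebot, and leaves every other robot unrestricted:

User-agent: Googlebot
Disallow: /drafts/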

Be careful not to add a / after Disallow (except in very specific cases), because it would block the indexing of all the pages of your website!
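For reference, this is the blocking variant in question. Publishing it would ask every crawler to stay away from your entire site, which is only useful in those very specific cases (a staging site, for example):

User-agent: *
Disallow: /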

We can block the indexing of a page by adding its slug after Disallow. For example, I can block the indexing of my contact page by adding the following robots.txt file:

User-agent: *
Disallow: /contact
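One subtlety to keep in mind: Disallow rules match by URL prefix, so Disallow: /contact would also block a hypothetical /contact-us page. According to Google's documentation, the most specific (longest) matching rule wins, so you could keep such a page crawlable with an explicit Allow:

User-agent: *
Disallow: /contact
Allow: /contact-us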

I can block all the pages of a folder from being indexed, for example a blog:

User-agent: *
Disallow: /blog/

I can even allow the indexing of one of the pages in the blog folder:

User-agent: *
Disallow: /blog/
Allow: /blog/post-1
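Putting it all together, a robots.txt file often also declares the sitemap so that crawlers can find it easily. Here is a minimal sketch, assuming your sitemap sits at the standard location (replace the example domain with your own):

User-agent: *
Disallow: /blog/
Allow: /blog/post-1
Sitemap: https://www.example.com/sitemap.xml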

To go further, here is the official Google documentation on robots.txt!

How to add a robots.txt in Webflow?

In Webflow, adding a robots.txt is very simple. Go to your project's general settings, open the "SEO" tab, and paste your directives into the "robots.txt" field. Then save your changes (Save Changes) and publish your project.

[Screenshot: the robots.txt field in Webflow's SEO project settings]

How to check that your robots.txt is operational?

To check that your robots.txt is in place on your site, you can use a Chrome extension like "Detailed SEO Extension" and click on "Robots.txt". You can also simply open your-domain.com/robots.txt in your browser, since the file is always served at the root of the site.

[Screenshot: Detailed SEO Extension showing the robots.txt check]

You can now optimize the robots.txt of your Webflow website! For a custom, SEO-optimized website, don't hesitate to contact our agency!

Discover a complete guide to Webflow SEO
