Knowledge Base

Robots.txt Content Type

Last Modified: 06 Apr 2026
User Level: Administrator

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests. This article demonstrates how to create a robots.txt Content Type in Terminalfour.
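For context, a minimal robots.txt might look like the following (example rules only; your own directives will differ):

```
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```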

You must be an Administrator to complete this action.

Create a new Content Type

You'll need to create a new Content Type by going to Assets > Content Types and clicking on the "Create content type" button. Fill in the settings shown below.

Name: Robots.txt file
Description: Used to add a robots.txt file to the current section; used in the root/home section.
Minimum User Level: Administrator

Add Elements

Once on the Elements tab, click the "Add Element" button at the bottom. Fill in the settings shown below. Then click "Save Changes".

Element: Name
Description: The Name Element
Type: Plain Text
Characters: Default
Required: Yes
Show: Yes

Element: Robot File
Description: Enter the code you wish to be output in the robots.txt file.
Type: Plain Text
Characters: 2000
Required: Yes
Show: Yes


Create a robots-content content layout

  1. On the Content Layout tab, click the "Add content layout" button. Fill in the General settings as shown below.

     Name: text/robots-content
     File extension: (Default)
     Syntax Type: HTML/XML
     Content Layout Processor: Handlebars Content

  2. Click the "Content Layout Code" tab and enter the following: {{publish element="Robot File"}}
  3. Click "Save Changes".

Create a default content layout

  1. On the Content Layout tab, click the "Add content layout" button. Fill in the General settings as shown below.

     Name: text/html
     File extension: (Default)
     Syntax Type: HTML/XML
     Content Layout Processor: Handlebars Content

  2. Click the "Content Layout Code" tab and enter the following: {{nav id="XXX" name="Create Robots.txt"}}
  3. Click "Save Changes".

Note: This Content Layout contains a Navigation Object. We'll modify this later.

Create the Navigation Object: Create Robots.txt

  1. Create a new Navigation Object by going to Assets > Navigation and clicking the "Add new navigation" button.
  2. This uses the Generate file Navigation Object. Set it up using the options in the table below; unlisted or empty options should be left at their default values.
  3. Copy the generated Navigation ID (or Handlebars tag); you'll need it in the next section.
Name: Create Robots.txt
File Name: robots
Append the content ID to the name of the file: (default)
File extension: txt
Output directory: Use the current directory
Append the current section path to the base directory: (default)
Content layout: text/robots-content

Update content layout with new Navigation

Once you have the Navigation ID, go back and update the default content layout.

  1. Go to Assets > Content Types.
  2. Search for your new "Robots.txt file" Content Type.
  3. Click "Actions".
  4. Click "Edit content layouts".
  5. Choose "text/html".
  6. In the Content Layout Code, replace the placeholder ID ("XXX") with the Navigation ID you copied in the previous step.

Create the file

Now that you've finished your new Content Type, you need to create the actual robots.txt content using it. Make sure you do this in your home section (the file needs to be published at the root of your website).

  1. Go to your Site Structure
  2. Navigate to your Home section
  3. From there, click on the "Content types" tab
  4. Search for "Robots.txt file"
  5. Enable the Robots.txt file content type for the Section
  6. Switch to the "Content" tab
  7. Click "Add content"
  8. Add a piece of content using the "Robots.txt file" content type
  9. Enter Name: "Robots.txt"
  10. Paste your rules into the "Robot File" element

More information regarding Robots.txt and the rules you can use can be found at Google's Robots.txt Introduction and Guidance page.
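After publishing, it can be worth sanity-checking that your rules behave as intended. One way to do this is a short sketch using Python's standard-library urllib.robotparser (the rules and URLs below are placeholders; substitute the contents of your published robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Example rules only -- replace with the contents of your published robots.txt.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether a generic crawler may fetch specific URLs under these rules.
print(rp.can_fetch("*", "https://www.example.com/private/page"))  # False
print(rp.can_fetch("*", "https://www.example.com/index.html"))    # True
```

You can also point RobotFileParser at your live file with set_url() and read() once the page is published.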
