WordPress uses a virtual robots.txt file, which means no actual file is present on the web server; WordPress generates the file on the fly each time someone requests it from your site.

A robots.txt file is a good way to make your site search engine friendly, and managing it is often a core feature of Search Engine Optimisation (SEO) plugins.

You do, however, want to limit what the search engines index, so you should disallow access to certain folders such as wp-admin and wp-includes.

The easiest way to do this is to create a robots.txt file in a text editor and place it in the root folder of your WordPress installation, i.e. the folder where wp-admin, wp-content and wp-includes are located.
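As a sketch, a minimal robots.txt along these lines blocks crawlers from the admin and includes folders; the admin-ajax.php and Sitemap lines are common additions rather than requirements, and the sitemap URL shown is a placeholder you would replace with your own:

```
# Apply these rules to all crawlers
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
# Allow AJAX requests, which some themes and plugins rely on
Allow: /wp-admin/admin-ajax.php
# Placeholder sitemap location - replace with your site's URL
Sitemap: https://example.com/sitemap.xml
```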

The robots.txt file created this way takes precedence over the virtual one generated by WordPress and therefore gives you more control. Note that certain themes ship with their own robots.txt file that takes effect when you activate the theme; if that is the case, make your changes in the theme's robots.txt file instead.

Below is an example of a robots.txt file I created on my demo WordPress server using nano.
