KB Robots.txt 1.0.1
File ID: 93509
File Size: 10.0 KB
KB Robots.txt 1.0.1 Description
When robots (like the Googlebot) crawl your site, they begin by requesting http://example.com/robots.txt and checking it for special instructions. Use this plugin to create and edit your robots.txt file from within WordPress (using Options -> Robots.txt).
Whenever a user (or a robot, more likely) appends "robots.txt" to your blog URL (e.g. http://blog.example.com/robots.txt), this plugin will serve up the robots.txt file that you created in the WordPress admin menu.
This plugin should work with most versions of WordPress, but it is particularly intended for WP-MU installations, since it allows each WPMU blog to have a unique robots.txt file.
Note that robots make only top-level requests for robots.txt files. If you have WordPress installed in a subdomain (e.g. http://blog.example.com/) or in your root (e.g. http://example.com/), this plugin will work as intended. But if you have WordPress installed in a subdirectory (e.g. http://example.com/blog/), then this plugin won't do much for you, since the search engines won't look for http://example.com/blog/robots.txt, only for http://example.com/robots.txt.
Also, this plugin requires that you use some form of permalink structure. If links to your blog posts look like http://example.com/index.php?p=36, it won't work.
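To see why the top-level location matters, consider how a crawler applies the rules it finds there. The sketch below uses Python's standard-library robots.txt parser (not part of this plugin) with hypothetical rules and URLs to show that a Disallow line only affects paths under the site root where the file was fetched:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules of the kind this plugin might serve at
# http://blog.example.com/robots.txt
rules = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Paths under /wp-admin/ are blocked for all user agents...
print(rp.can_fetch("*", "http://blog.example.com/wp-admin/options.php"))  # False
# ...while ordinary post URLs remain crawlable.
print(rp.can_fetch("*", "http://blog.example.com/2006/05/hello-world/"))  # True
```

A crawler looking at http://example.com/robots.txt would never see rules served from a subdirectory install, which is why the plugin only helps for root or subdomain installations.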
If you post your support questions as comments below, I probably won't see them. If the FAQs don't answer your questions, you can post support questions at the KB Robots.txt plugin page on my site.
1. Upload kb_robots-txt.php to the /wp-content/plugins/ directory.
2. Activate the plugin through the 'Plugins' menu in WordPress.
3. Go to the new 'Options => KB Robots.txt' admin page. In the box, write whatever you want your robots.txt file to say. There are examples below the box.
4. Test that it worked by going to http://yourblog.example.com/robots.txt to see your new robots.txt file.
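The admin page includes examples below the text box; as a rough illustration, a rule set commonly used on WordPress blogs (the exact paths depend on your installation) looks like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```

Whatever you type in the box is served verbatim when a robot requests robots.txt from your blog's root.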
O/S:BSD, Linux, Solaris, Mac OS X
More Similar Code
The robots.txt file is the first thing search engine "spiders" look for when indexing a website. The robots.txt file tells search engine spiders (robots) which files and/or directories they are NOT allowed to index. This helps to prevent incomplete site indexing as well as prevent exposing the files and directories that you don't want "all over the world wide web". It's fast, easy and FREE.
The robots.txt generator is a PHP script that lets you browse the directory structure of your web server. If a robots.txt file is found, the corresponding folders are checked. In addition,...
Make Search Engine Friendly Robots.txt Files Fast and Easy! Pattern matching. Visit Time. Request Rate. Crawl delay, Sitemaps, and more. Add information quickly and easily. Import an existing robots.txt file for editing and save it to your...
Reads and parses /robots.txt files.
robots.txt generator is a multi-platform compatible script. It checks whether robots.txt exists in your server's directory structure. If found, it is read and the appropriate allowed/disallowed folders are checked or unchecked.
Use this module when you are running multiple Drupal sites from a single code base (multisite) and you need a different robots.txt file for each one. This module generates the robots.txt file dynamically and gives you the chance to edit it, on a...
Generates dynamic Google-Sitemaps/ XML-Sitemaps, robots.txt file and metatags for SEO (Search Engine Optimization).
JillyBoel Spambot Foiler is a counter to spambots that attempt to collect email addresses from your website. It works by creating and feeding invalid email addresses to the spambot....
Onix is a high performance full-text indexing and retrieval engine toolkit. Onix provides access to one of the fastest text indexing engines in the industry through an easy to use "C" API. Onix provides support for both ASCII and Unicode...
Sitemap Writer Pro is an easy-to-use program, fast and efficient, which offers 6 different types of sitemaps (Standard Sitemap, Google News, Google Video, Google Code Search, Google Geo, Google Mobile) to properly index every website, generate...