August 24, 2016

Hello World

Today marks the start of the Moderatobot project.

Moderatobot is a website moderator. The main aim of the service is to free webmasters and site moderators from routine work and save them time for more essential matters.

In its final version, Moderatobot will be able to perform continuous, automatic SEO and content moderation of websites.

There is still much to do, but you can already use Moderatobot to check robots.txt files.

Checking the robots.txt file

With a robots.txt file, a webmaster can control the behavior of search engine robots (crawlers) when they index the site. The entire site or selected areas of it can be closed to indexing, and robots can be given important information such as the location of Sitemap files or the expected delay between two consecutive requests to the site. A small example is shown below.
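For illustration, here is a minimal robots.txt file that closes one directory to all crawlers, requests a crawl delay, and points to a sitemap. The directory path and the sitemap URL are invented for this example:

    User-agent: *
    Disallow: /admin/
    Crawl-delay: 10
    Sitemap: https://www.example.com/sitemap.xml

Note that some directives, such as Crawl-delay, are honored by some search engines but ignored by others, which is one reason a careful check of the file is useful.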

Errors in a robots.txt file, or deviations from the standard, can cause problems with the indexing of your website by some search engines.

Moderatobot checks your robots.txt file to identify potential problems that search robots might face when processing it, and then offers its own version of the file that best meets the standard and is understood in the same way by most major robots.
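As a rough illustration of what such a check involves (this is not Moderatobot's implementation, just a minimal sketch using Python's standard urllib.robotparser, with a made-up site URL), the following script fetches a robots.txt file and reports whether a sample URL may be crawled, what crawl delay is requested, and which Sitemaps are declared:

    from urllib.robotparser import RobotFileParser

    # Hypothetical site used only for this example.
    ROBOTS_URL = "https://www.example.com/robots.txt"

    parser = RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # downloads and parses the robots.txt file

    # Check whether a generic crawler may fetch a sample page.
    url = "https://www.example.com/admin/settings"
    print("Can fetch:", parser.can_fetch("*", url))

    # Report the crawl delay requested for this user agent, if any.
    print("Crawl delay:", parser.crawl_delay("*"))

    # List the Sitemap entries declared in the file (Python 3.8+).
    print("Sitemaps:", parser.site_maps())

A full checker additionally has to flag syntax errors and directives that different search engines interpret differently, which is the part Moderatobot automates.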

Robots.txt theory  |  How to check robots.txt  |  Start checking
We appreciate every opinion that can help improve the service.