A Whole Blog in Robots.txt - Broke all Normal Blogging Trends

Status
Not open for further replies.

mrintech

Technomancer
All of us use WordPress, Blogger, MySpace, LiveJournal, etc. for our blogging purposes. Those who run self-hosted WordPress blogs will know about robots.txt
*i37.tinypic.com/2nsqs0i.jpg

As taken from Wikipedia:
A robots.txt file on a website will function as a request that specified robots ignore specified files or directories in their search. This might be, for example, out of a preference for privacy from search engine results, or the belief that the content of the selected directories might be misleading or irrelevant to the categorization of the site as a whole, or out of a desire that an application only operate on certain data.
For websites with multiple sub-domains, each sub-domain must have its own robots.txt file. If example.com had a robots.txt file but a.example.com did not, the rules that would apply for example.com will not apply to a.example.com.
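To make the quoted description concrete, here is a minimal sketch of how a crawler-side parser reads such rules, using Python's standard `urllib.robotparser` module. The rules and paths are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that asks all crawlers to skip one directory
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Paths under /private/ are disallowed; everything else is allowed
print(rp.can_fetch("*", "http://example.com/private/notes.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/post.html"))    # True
```

Note that, as the quote says, this file only governs example.com itself; a.example.com would need its own robots.txt.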
Now, a guy from WebmasterWorld has broken all the normal trends of blogging. Instead of blogging on WordPress, Blogger, etc., he started blogging in his robots.txt file!
This is a most innovative idea. People normally use robots.txt to tell search engines which content to exclude and which to include, but this guy completely transformed his robots.txt file into his blog!
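The source does not say exactly how he formatted it, but one plausible approach is to write the blog entries as comment lines, since the robots.txt format treats any line starting with `#` as a comment that crawlers ignore. A hypothetical entry might look like:

```
# robots.txt -- doubles as a blog (hypothetical sketch)
# 2008-10-01: Decided to start blogging right here in robots.txt.
# Crawlers skip these comment lines, so the file still works normally.
User-agent: *
Disallow:
```

The trailing `User-agent` / `Disallow` lines keep the file a valid robots.txt that allows everything, so search engines are unaffected by the prose above them.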


Source: *tech-baby.co.cc/a-whole-blog-in-robotstxt-broke-all-normal-blogging-trends
 