Interesting to read that Google, Microsoft and Yahoo! have collaborated on standards for the Robots Exclusion Protocol (REP) – in simpler terms, the directives you use to tell search engines not to spider a particular page or section.
The full details are available here.
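For anyone unfamiliar with the mechanics: REP centres on a plain-text robots.txt file placed at the root of your site, which crawlers check before spidering. Below is a minimal illustrative sketch – the example.com URL and the paths are placeholders, not taken from the announcement:

    # Illustrative robots.txt – lives at the site root, e.g. https://www.example.com/robots.txt
    User-agent: *
    # Tell all crawlers not to spider anything under /private/
    Disallow: /private/
    # Allow lets you carve out an exception to a broader Disallow
    Allow: /private/faq.html
    # Point crawlers at your XML sitemap
    Sitemap: https://www.example.com/sitemap.xml

The jointly supported standards also cover a per-page equivalent, the robots meta tag (e.g. noindex or nofollow in the page's HTML head), for cases where you want page-level control rather than site-wide rules.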
Similar to Google’s guidelines for Webmasters, it would be helpful to have a set of consistent standards for the basics of search engine optimisation across the search engines. Obviously each search engine has its own closely-guarded ranking algorithm, but consistent guidelines on sitemaps, use and abuse of keywords, and clear dos and don’ts would go a long way towards putting the emphasis firmly on supporting white hat practices.