What do we do if we want to specify that some of our pages should become unavailable in search after a certain time?
Google has a feature for exactly this: the new unavailable_after meta tag, announced by Dan Crow on the Official Google Blog. It allows a more dynamic relationship between your site and Googlebot. Just think: on www.abc.com, any time you have a temporarily available news story, a limited-offer sale, or a promotional page, you can specify the exact date and time you want specific pages to stop being crawled and indexed.
Let's assume you're running a promotion that expires at the end of 2010 on the page www.abc.com/2010promotion.html.
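In the head section of that page, you would use a GOOGLEBOT meta tag along these lines (a sketch following the unavailable_after format from Google's announcement, with the date matching the PDF example below):

<META NAME="GOOGLEBOT" CONTENT="unavailable_after: 01-Dec-2010 23:59:59 GMT">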
And the second piece of exciting news: the new X-Robots-Tag directive, which adds Robots Exclusion Protocol (REP) META tag support for non-HTML files! Finally, you can have the same control over your videos, spreadsheets, and other indexed file types. Using the example above, let's say your promotion page is in PDF format. For www.abc.com/2010promotion.pdf, you would send the following HTTP header:
X-Robots-Tag: unavailable_after: 01-Dec-2010 23:59:59 GMT
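If the PDF is served by Apache, one way to attach that header is with mod_headers in your server config or .htaccess (just a sketch, assuming mod_headers is enabled and the filename from the example above):

# Send the REP header only for the promotion PDF (requires mod_headers)
<Files "2010promotion.pdf">
  Header set X-Robots-Tag "unavailable_after: 01-Dec-2010 23:59:59 GMT"
</Files>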
Remember, REP meta tags are useful for applying noarchive, nosnippet, and now unavailable_after at the page level (they can even be combined in a single tag, as in the example below), whereas robots.txt is controlled at the domain root. Google has gotten requests from bloggers and webmasters for these features, so enjoy. If you have other suggestions, keep them coming. Any questions? Ask them in the Webmaster Help Group.
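For reference, a combined page-level tag might look like this (an illustrative combination, not taken from the announcement; the directives are the ones named above):

<META NAME="GOOGLEBOT" CONTENT="noarchive, nosnippet, unavailable_after: 01-Dec-2010 23:59:59 GMT">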