Using Robots.txt To Help Google Images Index Your Website

There are a lot of webmasters out there who utilize the power of the robots.txt file, using it to block search engine spiders from crawling their CMS folders. At JVF we use this method so that our .js files, .css files, and other custom code are not exposed to the search engines. Be warned, though: when you add a Disallow rule to robots.txt, any images within the disallowed folder will not be indexed properly in Google Images. To ensure that the photos and images on your website are crawled and indexed properly, be sure to use the Allow rule as well. You can see an example of this in Google's own robots.txt file: http://www.google.com/robots.txt
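
As a baseline, a typical blocking rule looks like the sketch below; the /includes/ folder name is just an example, not a path from any particular CMS:

    User-agent: *
    # /includes/ is an example folder name
    Disallow: /includes/

With only this rule in place, everything under /includes/ is off limits to compliant crawlers, images included.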

They first Disallow everything within their safebrowsing folder, then Allow the specific paths that they want crawled and indexed.
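
At the time of writing, the relevant excerpt from Google's file looked roughly like this:

    Disallow: /safebrowsing
    Allow: /safebrowsing/diagnostic
    Allow: /safebrowsing/report_badware/
    Allow: /safebrowsing/report_error/
    Allow: /safebrowsing/report_phish/

The single Disallow blocks the whole /safebrowsing folder, and the longer, more specific Allow lines then re-open the individual paths Google wants crawled.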

If you were to apply this same pattern to your own website running a content management system, it would look something like this:

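Here is a minimal sketch, assuming a hypothetical CMS whose code lives under /cms/ and whose uploaded images live under /cms/uploads/images/ (adjust both paths to match your own install):

    User-agent: *
    # /cms/ is a hypothetical code folder; use your own
    Disallow: /cms/
    # re-open the image uploads folder so Google Images can crawl it
    Allow: /cms/uploads/images/

The Disallow keeps spiders out of your scripts, stylesheets, and templates, while the more specific Allow carves the image folder back out so Google Images can still crawl and index your photos.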