In a November 17 blog post, Andrew Boni of the Inside AdSense team shared some tips to help publishers optimize their AdSense ads using the robots.txt file.
Boni notes, "Your site's robots.txt file essentially acts as a gatekeeper that determines which web crawlers, web robots, and search engines have access to your site and which do not. AdSense ads are displayed through the use of an AdSense web crawler. That crawler scans your page's content and determines which ads to display, according to specific keywords. If the AdSense crawler is being blocked by a robots.txt file, we're going to have a difficult time displaying relevant ads on your site. As a result, your users may see less relevant ads, which can lead to a lower CTR."
If any of your URLs have errors, you can see what they are by logging into your AdSense account and clicking on 'Account Settings' from the home page. From there, click on 'View errors' under 'Access and Authorization.'
Here is how you can help yourself – per Boni’s post:
View the contents of your robots.txt file by going to [yourdomain.com]/robots.txt. Be sure that the file is configured to allow the AdSense ad crawler to view your site. Simply adding the following two lines to the very top of the file ensures that the AdSense crawler can access your site, which helps it display more relevant ads. As a result, you can potentially benefit from increased ad revenue.
Note: Adding these two lines to your robots.txt file will only help to deliver better, more relevant ads to pages with AdSense code already on them.
User-agent: Mediapartners-Google
Disallow:
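As a quick sanity check, you can verify that these two lines have the intended effect with a short Python sketch using the standard library's urllib.robotparser. The sample robots.txt below is an assumed illustration: the second rule blocking a /private/ path for all other agents is hypothetical, added only to show that the Mediapartners-Google entry overrides it for the AdSense crawler.

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt with Boni's two recommended lines at the top.
# The "User-agent: *" section below is a hypothetical rule for illustration.
robots_txt = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow means the AdSense crawler may fetch any page,
# even ones blocked for other crawlers.
print(parser.can_fetch("Mediapartners-Google", "/private/page.html"))  # True

# Other crawlers are still blocked from the /private/ path.
print(parser.can_fetch("SomeOtherBot", "/private/page.html"))  # False
```

Because a more specific User-agent entry takes precedence over the wildcard, the AdSense crawler can still scan pages that are off-limits to other bots, which is exactly the behavior the two recommended lines are meant to guarantee.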