Google Webmaster Tools has added new functionality to the “Fetch as Googlebot” feature that now provides a way to submit new and updated URLs to Google for indexing. “After you fetch a URL as Googlebot, if the fetch is successful, you’ll now see the option to submit that URL to our index,” the team of Webmaster Trends Analysts announced.
“This new functionality may help in several situations: if you’ve just launched a new site, or added some key new pages, you can ask Googlebot to find and crawl them immediately rather than waiting for us to discover them naturally. You can also submit URLs that are already indexed in order to refresh them. It could also help if you’ve accidentally published information that you didn’t mean to, and want to update the cached version after you’ve removed the information from your site.”
Here’s how to submit a URL:
The team explains, “First, use Diagnostics > Fetch As Googlebot to fetch the URL you want to submit to Google. If the URL is successfully fetched, you’ll see a new “Submit to index” link appear next to the fetched URL.
Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages. When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month. You can see how many submissions you have left on the Fetch as Googlebot page.
When you submit a URL in this way, Googlebot will crawl the URL, usually within a day. We’ll then consider it for inclusion in our index. Note that we don’t guarantee that every URL submitted in this way will be indexed; we’ll still use our regular processes, the same ones we use on URLs discovered in any other way, to evaluate whether a URL belongs in our index. Any URL submitted should point to content that would be suitable for Google Web Search, so if you’re trying to submit images or videos you should use Sitemaps instead.”
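The quota rules above (50 single-URL submissions per week, 10 “URL and all linked pages” submissions per month) are enforced by Webmaster Tools itself, which displays your remaining allowance. As a minimal sketch of how those limits behave, here is a hypothetical local tracker; the class name and rolling-window interpretation of “per week” and “per month” are assumptions for illustration, not Google’s implementation.

```python
from datetime import datetime, timedelta

# Limits as stated in the announcement.
WEEKLY_URL_LIMIT = 50       # individual URL submissions per week
MONTHLY_LINKED_LIMIT = 10   # "URL and all linked pages" submissions per month

class SubmissionTracker:
    """Hypothetical local tally of submissions against the stated quotas.

    Webmaster Tools shows the authoritative remaining quota; this only
    illustrates the two limits, assuming rolling 7-day and 30-day windows.
    """

    def __init__(self):
        self.url_submissions = []      # timestamps of single-URL submissions
        self.linked_submissions = []   # timestamps of URL+linked-pages submissions

    def can_submit_url(self, now=None):
        now = now or datetime.now()
        window_start = now - timedelta(days=7)
        recent = [t for t in self.url_submissions if t > window_start]
        return len(recent) < WEEKLY_URL_LIMIT

    def can_submit_linked(self, now=None):
        now = now or datetime.now()
        window_start = now - timedelta(days=30)
        recent = [t for t in self.linked_submissions if t > window_start]
        return len(recent) < MONTHLY_LINKED_LIMIT

    def record_url(self, now=None):
        self.url_submissions.append(now or datetime.now())

    def record_linked(self, now=None):
        self.linked_submissions.append(now or datetime.now())
```

Note that the two quotas are independent: exhausting the weekly single-URL allowance does not affect the monthly linked-pages allowance, and vice versa.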
Submit URLs to Google without verifying ownership:
Google has also updated the public “Add your URL to Google” form, which is now the Crawl URL form. It has the same quota limits for submitting pages to the index as the Fetch as Googlebot feature, but it doesn’t require verifying ownership of the site in question, so you can submit any URLs that you want crawled and indexed.
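Because the Crawl URL form accepts any URL without ownership verification, a quick local sanity check can catch typos before you spend one of your limited submissions. Below is a minimal sketch using Python’s standard `urllib.parse`; the function name is hypothetical, and this checks only that the string is a well-formed absolute http(s) URL, not that the page exists.

```python
from urllib.parse import urlparse

def looks_submittable(url: str) -> bool:
    """Return True if the string parses as an absolute http(s) URL.

    A hypothetical pre-check before pasting a URL into the Crawl URL
    form; it validates structure only, not whether the page resolves.
    """
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)
```

For example, `looks_submittable("example.com/page")` is False because the scheme is missing, a common paste error the form itself may reject.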
[Source: Webmaster Central blog]