Google has announced one more way for site owners to request that a specific web page be crawled. The Fetch as Googlebot feature in Webmaster Tools has been around for a while, but it now lets site owners submit a new or updated URL for indexing. The process is simple and doesn't require users to start at the beginning of the crawling process. After you fetch a URL, if the fetch is successful, you will see the option to submit that URL to the Google index.
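The submit option only appears after a successful fetch, so it can save time to sanity-check a URL before pasting it into Webmaster Tools. A minimal sketch of such a pre-check is below; `is_submittable` is a hypothetical helper, not part of any Google API, and the real fetch still happens inside the Fetch as Googlebot interface.

```python
from urllib.parse import urlparse

def is_submittable(url):
    """Rough pre-check before using Fetch as Googlebot.

    Googlebot needs an absolute http(s) URL on a site you have
    verified in Webmaster Tools. This only validates the URL's
    shape; it does not guarantee the fetch will succeed.
    """
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)

# Absolute http(s) URLs pass; relative paths and other schemes don't.
print(is_submittable("http://www.example.com/new-page.html"))  # True
print(is_submittable("example.com/new-page.html"))             # False
```

This catches the most common mistake, submitting a relative path or a bare domain fragment, before you burn one of your weekly submissions on a fetch that can't succeed.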
A URL will typically be indexed within a day, but this doesn't mean that every URL submitted will be indexed. Once a page has been crawled, Google evaluates whether the URL belongs in its index. As with any other discovery method (XML sitemaps, internal and external links, RSS feeds, etc.), Google goes through a separate evaluation to decide whether to keep the page in its index.
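For comparison, the XML sitemap route mentioned above is just a file of URL entries that hints at what to crawl. A minimal sitemap with one updated page might look like the following, where the domain and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> hints the page was updated -->
  <url>
    <loc>http://www.example.com/new-page.html</loc>
    <lastmod>2011-08-03</lastmod>
  </url>
</urlset>
```

Unlike Fetch as Googlebot, a sitemap entry doesn't trigger an immediate fetch; it's a passive hint, and Google still decides on its own schedule whether to crawl and index the page.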
This is a great function to use if you've just launched a new site, added new pages to a site, or updated important or time-sensitive content on an existing page. You no longer have to wait for Google to discover the page naturally. Keep in mind you are limited to 50 single-URL submissions per week. If you submit a URL along with all the pages it links to, those submissions are limited to 10 per month.
In addition to this update, Google has also updated the public “Add your URL to Google” form. It is now called the “Crawl URL form” and doesn’t require owner verification, but does still limit you to submitting 50 URLs per week.