If you suspect that data at any of the seeds crawled for a search collection has changed, you can refresh the crawl of those seeds. A refresh instructs the crawler to locate newly linked pages, remove pages that are no longer linked, and update pages that have been modified. To do this, the crawler starts a complete crawl, checking several options before crawling each seed.
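Conceptually, a refresh reconciles the previously crawled pages with what is currently reachable from the seed. The sketch below models that reconciliation as simple set arithmetic; the function name, data structures, and checksums are illustrative assumptions, not part of the product's API.

```python
# Illustrative model of a crawl refresh (not the product's API):
# given the pages indexed by the previous crawl and the pages now
# reachable from the seed, classify each page as newly linked (add),
# no longer linked (remove), or modified (update, by checksum).

def plan_refresh(previous, current):
    """previous/current map URL -> content checksum."""
    added = sorted(set(current) - set(previous))
    removed = sorted(set(previous) - set(current))
    modified = sorted(
        url for url in set(previous) & set(current)
        if previous[url] != current[url]
    )
    return {"add": added, "remove": removed, "update": modified}

plan = plan_refresh(
    previous={"/a": "x1", "/b": "x2", "/c": "x3"},
    current={"/a": "x1", "/b": "y9", "/d": "x4"},
)
print(plan)  # {'add': ['/d'], 'remove': ['/c'], 'update': ['/b']}
```

The three outcomes correspond directly to the crawler behaviors named above: new links are added, dropped links are removed, and changed pages are re-fetched.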
To refresh the crawl of the seed used in this tutorial, open another window and return to the Overview page, then click refresh. If you start the refresh in staging, both the Staging Status and Live Status tabs are displayed until indexing completes in staging and the data is automatically pushed to live. If any test fails during staging, the Staging Status tab continues to be displayed.
Some crawling options are set globally for a search collection, while others are defined by adding or refining Conditional Settings (described in Crawler Configuration). Global options are set by selecting a search collection's Configuration section, selecting the Crawling tab, and clicking edit beside the Global Settings heading.
While the crawl is being refreshed, the Live Status tab shows the amount of data that has been crawled and the amount that was actually downloaded. Return to the search page and you will see that the crawl has not disrupted the existing search; as before, indexing runs at the same time as the crawl. If you refreshed the data in staging, the new staging data is automatically made live, and therefore visible to your users, as soon as indexing finishes and the collection tests pass. The Query service is not interrupted while the data is made live, so users can always search successfully.
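The staging-to-live behavior described above reduces to a simple rule: refreshed data becomes visible only once indexing has finished and every collection test has passed; otherwise the live index keeps serving queries unchanged. The function below is a hypothetical sketch of that decision, not the product's implementation, and the test names are invented for illustration.

```python
# Hypothetical sketch of the staging promotion rule described in the
# text: staging data goes live only when indexing is complete and all
# collection tests pass; otherwise the live index keeps serving queries.

def promote_staging(indexing_done, test_results):
    """Return the status after a staging refresh attempt.

    indexing_done: True once staging indexing has finished.
    test_results: mapping of test name -> bool (True means passed).
    """
    if not indexing_done:
        return "staging: indexing in progress"
    failed = sorted(name for name, ok in test_results.items() if not ok)
    if failed:
        # The Staging Status tab keeps displaying; live is untouched.
        return "staging: tests failed (" + ", ".join(failed) + ")"
    # Live is swapped without interrupting the Query service.
    return "live: staging data promoted"

print(promote_staging(True, {"link-check": True, "index-integrity": True}))
# live: staging data promoted
print(promote_staging(True, {"link-check": False, "index-integrity": True}))
# staging: tests failed (link-check)
```

Because the swap happens only after the checks succeed, queries are always answered from a complete, consistent index.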
To proceed with the tutorial, click Crawler Configuration.