We are looking to have a Solr 4.9 setup where a very simple crawler wipes out and reloads a "crawler" core, then triggers a copy of the data over to a "search" core when the crawl is done. The reason is that our crawler is VERY simple and doesn't track documents in a way that would be conducive to doing updates and deletes. Basically, the crawler will wipe out the entire "crawler" core, rip through about 50k documents (committing every 1,000 or so), and then trigger something to copy the data over to the other "search" core.
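To make the flow concrete, here is a sketch of the crawl cycle as plain HTTP calls (the Solr base URL and the `docs.xml` batch file are assumptions; the core names are the ones above). It is printed as a dry run rather than executed:

```shell
#!/bin/sh
# Assumed default Solr base URL; adjust host/port as needed.
SOLR="http://localhost:8983/solr"

# 1. Wipe the "crawler" core before a fresh crawl.
echo "curl '$SOLR/crawler/update?commit=true' -H 'Content-Type: text/xml'" \
     "--data-binary '<delete><query>*:*</query></delete>'"

# 2. Index documents in batches, committing every ~1,000 docs
#    (docs.xml is a hypothetical batch file produced by the crawler).
echo "curl '$SOLR/crawler/update?commit=true' -H 'Content-Type: text/xml'" \
     "--data-binary @docs.xml"

# 3. When the crawl finishes, trigger the copy over to the "search" core --
#    this last step is the part we don't know how to do.
```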
Assuming we would have to restart the "search" core, how could this be done from the command line or from code?