It seems pretty standard to use cross-validation to determine the best hyperparameter values. Of course, this is usually a time-consuming process. Are there any shortcuts? Are there other, faster, forms of exploratory analysis that can provide a hint as to which values will be best?
For example, at my current understanding of machine learning and SVM, I might do something like perform an initial logarithmic grid search for C over [10^-5, 10^5], stepping by powers of 10, and then fine-tune from there. But is there a way I could quickly estimate that the best C is somewhere between 10^3 and 10^5, and then perform more specific searches?
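To make the coarse-then-fine idea concrete, here is a rough sketch of what I mean in scikit-learn; the dataset, RBF kernel, and grid sizes are just placeholders for illustration, not part of my actual setup:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder dataset for illustration only.
X, y = load_breast_cancer(return_X_y=True)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Coarse pass: one value of C per decade over [10^-5, 10^5].
coarse_grid = {"svc__C": np.logspace(-5, 5, 11)}
coarse = GridSearchCV(pipe, coarse_grid, cv=5, n_jobs=-1).fit(X, y)
best_c = coarse.best_params_["svc__C"]

# Fine pass: zoom in one decade on either side of the coarse winner.
fine_grid = {"svc__C": np.logspace(np.log10(best_c) - 1, np.log10(best_c) + 1, 21)}
fine = GridSearchCV(pipe, fine_grid, cv=5, n_jobs=-1).fit(X, y)
print(fine.best_params_, fine.best_score_)
```

The coarse pass alone already costs 11 cross-validated fits, which is the kind of expense I am hoping to shortcut.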
This question probably applies to most ML techniques, but I happen to be working with SVM right now.