You probably can't come up with a definitive list. For example, I did some tests a while back on searching sorted lists in .NET. With a sorted list of integers, binary search turned out to be faster than sequential search once the list held 13 or more items. With a sorted list of strings, the crossover was at 8 items. With other types whose comparisons were more expensive, the crossover came even earlier.
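For context, the two routines I was comparing looked more or less like the sketch below. The names and the bitwise-complement "insertion point" convention are my own here, not code from any particular library:

```csharp
using System.Collections.Generic;

static class SmallListSearch
{
    // Straight-line scan; bails out early because the list is sorted.
    public static int SequentialSearch<T>(IReadOnlyList<T> sorted, T target)
    {
        var cmp = Comparer<T>.Default;
        for (int i = 0; i < sorted.Count; i++)
        {
            int c = cmp.Compare(sorted[i], target);
            if (c == 0) return i;   // found
            if (c > 0) return ~i;   // passed where it would have been
        }
        return ~sorted.Count;       // larger than everything in the list
    }

    // Classic binary search: far fewer comparisons, but more work per step.
    public static int BinarySearch<T>(IReadOnlyList<T> sorted, T target)
    {
        var cmp = Comparer<T>.Default;
        int lo = 0, hi = sorted.Count - 1;
        while (lo <= hi)
        {
            int mid = lo + ((hi - lo) >> 1);
            int c = cmp.Compare(sorted[mid], target);
            if (c == 0) return mid;
            if (c < 0) lo = mid + 1;
            else hi = mid - 1;
        }
        return ~lo;                 // not found; ~lo is the insertion point
    }
}
```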
Running the same test in a different language or with a different runtime library will give you different numbers. The crossover can also depend on the memory hierarchy and probably on other hardware characteristics.
The conventional wisdom was (perhaps still is) that sequential search was so much simpler than binary search that the reduced complexity gave it a large advantage on small lists. The truth today is that CPU speeds and memory access are so fast that the simplicity of sequential search is a factor only when the lists are very small.
At best you can come up with a definitive set of rules that applies to one runtime configuration on specific hardware for a specific data type. Change the environment or the data type and you have to write benchmarks and measure it all over again.
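If I had to redo that measurement today it would look roughly like this, using the two search methods from the earlier sketch. The list sizes, probe counts, and Stopwatch timing here are arbitrary choices of mine, not my original benchmark, and for anything serious a harness like BenchmarkDotNet will give you much less noisy numbers:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class CrossoverBenchmark
{
    static void Main()
    {
        var rng = new Random(12345);

        for (int size = 2; size <= 64; size++)
        {
            // A sorted list of the type you actually care about.
            int[] data = Enumerable.Range(0, size)
                                   .Select(_ => rng.Next(1000))
                                   .OrderBy(x => x)
                                   .ToArray();

            // Values to look up; misses are as interesting as hits.
            int[] probes = Enumerable.Range(0, 100_000)
                                     .Select(_ => rng.Next(1000))
                                     .ToArray();

            double seq = Time(() => { foreach (var p in probes) SmallListSearch.SequentialSearch(data, p); });
            double bin = Time(() => { foreach (var p in probes) SmallListSearch.BinarySearch(data, p); });

            Console.WriteLine($"n={size,3}  sequential={seq,8:F2} ms  binary={bin,8:F2} ms  winner={(seq <= bin ? "sequential" : "binary")}");
        }
    }

    // Run once to warm up the JIT, then time a second run.
    static double Time(Action body)
    {
        body();
        var sw = Stopwatch.StartNew();
        body();
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds;
    }
}
```

The point isn't whatever numbers this prints on my machine; it's that you have to run it with your types, your runtime, and your hardware to find your crossover.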