I've been studying some sorting algorithms and have come across an inverse relationship between time and space complexity. For example, an algorithm like selection sort takes O(n^2) time but only requires constant space, since it can be done in place. An algorithm like merge sort, however, has O(n log n) time complexity but requires O(n) extra space.
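To make the contrast concrete, here is a minimal, illustrative Python sketch of the two algorithms I have in mind (not optimized; just meant to show where the extra space comes from):

```python
def selection_sort(a):
    """In-place: O(n^2) time, O(1) extra space."""
    n = len(a)
    for i in range(n):
        # Find the index of the smallest remaining element and swap it in.
        m = min(range(i, n), key=a.__getitem__)
        a[i], a[m] = a[m], a[i]
    return a

def merge_sort(a):
    """O(n log n) time, but the slicing and merge step allocate O(n) extra space."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves into a new list (this is the O(n) space).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Both produce the same sorted output; the difference is that selection sort mutates the input list directly while merge sort builds new lists at every level of recursion.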
My questions are:

1. Is there a theorem or law that relates time and space complexity trade-offs to one another? Is this phenomenon only present in sorting algorithms, or does the trade-off also show up in other problems?
2. Is it always a good idea to trade space complexity for time complexity, given the huge increase in modern RAM sizes? Or are there times when decreasing the time complexity would make the space complexity prohibitively large?