You're asking about a couple of different things here.
Undefined behavior exists because there are things which aren't legal in C but which for various reasons it is prohibitively difficult or even downright impossible for a compiler to warn you about. For example, if you write
int a[10];
int *p = a;
for(int i = 0; i < 20; i++)
    *p++ = i;    /* iterations 10 through 19 write past the end of a */
it is very hard for a conventional C implementation to detect (either at compile time or at run time) that you have done something very wrong. Therefore, it's your job not to do this: the compiler isn't obligated to generate code that works, nor is it obligated to give you an error message telling you that the program won't work.
Translation limits exist because no computer program, including a C compiler, can do everything, or access infinite amounts of memory. There will be C programs that a given compiler can't compile, not because the program contains an error, but simply because it is "too big" in some way.
Suppose your compiler has a data structure, an array, containing one element for each source line of your program. And suppose that the programmer of your C compiler was too lazy to make it a dynamically-allocated array. Suppose, for example, that the array is declared with 1,000 elements, meaning that you can't compile a C source file of more than 1,000 lines.
This would be a poor strategy, because it fails to honor the Standard's recommendation that implementations "avoid imposing fixed translation limits whenever possible". But that's not the question — the question for today is, with that compiler, what happens if you try compiling a 1,001-line source file?
If the compiler did the moral equivalent of that earlier code fragment I wrote, by doing something like
struct sourceline source[1000];
struct sourceline *p = source;
while(!feof(ifp))
    *p++ = parseline(ifp);
then, yes, if you tried to compile a 1,001-line source file, something undefined would happen. The compiler might corrupt its internal data structures and generate bad code for you. Or the compiler itself might crash.
But now we get to a third thing the Standard talks about, which you didn't mention: quality of implementation. A compiler that not only had a fixed limit on the size of your source file, but crashed or did something undefined when you exceeded it, would be an implementation of exceedingly poor quality. If a C program (including a C compiler) has a fixed-size array in it, then detecting and preventing overflow of that array is not "prohibitively difficult or even downright impossible". It is, rather, an ordinary, everyday, bread-and-butter task that every competent C programmer, and certainly a C programmer who's writing a C compiler, must be able to handle.
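To make that concrete, here is one minimal sketch of the kind of check I mean. The struct sourceline and parseline from the fragment above are reduced to trivial stubs, and MAXLINES, nlines, and storeline are names I've made up for illustration; I'm not quoting any real compiler here.

#include <stdio.h>
#include <stdlib.h>

#define MAXLINES 1000                       /* the compiler's fixed translation limit */

struct sourceline { int dummy; };           /* hypothetical per-line record */

struct sourceline source[MAXLINES];
int nlines = 0;

/* hypothetical stand-in for the real parser: reads and discards one line */
struct sourceline parseline(FILE *ifp)
{
    struct sourceline sl = {0};
    int c;
    while ((c = getc(ifp)) != EOF && c != '\n')
        ;
    return sl;
}

void storeline(FILE *ifp)
{
    if (nlines >= MAXLINES) {
        /* diagnose the breach of the limit instead of overflowing the array */
        fprintf(stderr, "fatal: source exceeds the %d-line translation limit\n", MAXLINES);
        exit(EXIT_FAILURE);
    }
    source[nlines++] = parseline(ifp);
}

int main(void)
{
    int c;
    while ((c = getc(stdin)) != EOF) {      /* peek, so we only parse lines that exist */
        ungetc(c, stdin);
        storeline(stdin);
    }
    printf("%d lines\n", nlines);
    return 0;
}

The check costs one comparison per line, which is nothing next to the work of actually parsing the line.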
So, bottom line, this is a quality of implementation issue: I would posit that any decent-quality C compiler that had a fixed translation limit would treat a breach of that limit as an explicitly diagnosed error, not as silent undefined behavior.
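And a compiler of even better quality would follow the Standard's recommendation and avoid the fixed limit altogether, by growing the array on demand. Here is a sketch of that variant, again with nothing but hypothetical names: it reuses the stub struct sourceline and parseline from the sketch above, and replaces the fixed array and storeline with versions that call realloc.

struct sourceline *source = NULL;
int nlines = 0, nalloc = 0;

void storeline(FILE *ifp)
{
    if (nlines >= nalloc) {
        /* grow the array instead of imposing a fixed translation limit */
        int newalloc = nalloc ? 2 * nalloc : 128;
        struct sourceline *tmp = realloc(source, newalloc * sizeof *source);
        if (tmp == NULL) {
            fprintf(stderr, "fatal: out of memory after %d lines\n", nlines);
            exit(EXIT_FAILURE);
        }
        source = tmp;
        nalloc = newalloc;
    }
    source[nlines++] = parseline(ifp);
}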
[Footnote: Yes, while(!feof(ifp)) is always wrong: feof reports end-of-file only after a read has already failed, so the loop body runs one extra time on stale data. But that was an example of bad code anyway, so I didn't worry that it also had that other egregious error in it.]