I have used UltraEdit in the past, but was not entirely impressed for the $60 price when the only aspect of it I preferred over Notepad++ was its large-file handling.
What free programs are good at reading/editing massive text files?
I use vim, so I recommend it; it handles big files well.
However, if you want to edit big files sequentially, the most powerful way is to use a programming language with input/output facilities: Python, Perl, C, Go, and many others.
For Python, see this answer: Lazy Method for Reading Big File in Python?
For the editing itself, you might consider regular expressions.
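To make that concrete, here is a minimal C sketch (the file names big.txt and big.edited are placeholders for the example): it streams the file line by line with getline, so memory use stays bounded no matter how large the file is, and the edit itself would go inside the loop.
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE* in = fopen("big.txt", "r");      /* placeholder input name */
    FILE* out = fopen("big.edited", "w");  /* placeholder output name */
    if (!in || !out) {
        perror("fopen");
        return EXIT_FAILURE;
    }

    char* line = NULL;  /* buffer allocated and grown by getline (POSIX) */
    size_t cap = 0;
    while (getline(&line, &cap, in) != -1) {
        /* edit the line here (e.g. with a regex library), then write it out */
        fputs(line, out);
    }

    free(line);
    fclose(in);
    fclose(out);
    return EXIT_SUCCESS;
}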
And (at least on Linux), if you just want to look into the huge files, not change them, you could also use pagers like the more, less, and most commands.
If you can modify the application handling these files, you could make it able to construct a huge content out of smaller files. A simple trick might be that if a "file name" starts with some special character, e.g. ! or |, you use popen & pclose instead of fopen & fclose, that is, something like
#include <stdio.h>    /* fopen, fclose, popen, pclose (popen is POSIX) */
#include <stdbool.h>

FILE* input;
bool ispipe;
char* filename;
//... get the filename; open the file or pipe
if (filename[0] == '!') {
    // run the rest of the name as a shell command and read its output
    input = popen(filename + 1, "r");
    ispipe = true;
}
else {
    input = fopen(filename, "r");
    ispipe = false;
}
//... process the input using sequential I/O ...
// close input: a popen-ed stream must be closed with pclose, not fclose
if (ispipe)
    pclose(input);
else
    fclose(input);
Then you might pass '!cat foo1 foo2 foo3' as the program argument giving filename; the standard output of that shell command becomes the stream your program reads, so it sees the concatenation of the three files without any temporary file.
And you did not define what you call a huge file: on a common Linux desktop with an x86-64 processor, a 64-bit Linux, and 8 gigabytes of RAM, you can edit a 4 GB file with standard editors (but I recommend avoiding that situation).