Application: I wish to publish a web application that takes an input string, searches for it in about 5,000 plain-text files, and returns the names of the files that contain matches. Each text file is about 4 MB (uncompressed).
Problem:
In PHP, I could use `exec('grep -l pattern dir/*')` and get the job done. However, for cost reasons, I would go for a shared web-hosting plan, and such plans normally do not allow executing external programs.
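For reference, this is roughly what I have in mind; a minimal sketch in which `dir/` is a placeholder and the `-F` flag (my addition) makes grep treat the search string as literal text rather than a regular expression:

```php
<?php
// What I would do on a host that allows exec():
// grep -l lists only the names of matching files; -F treats the
// search string as literal text, not a regex.
$pattern = $_GET['q'] ?? '';
$matches = [];
// escapeshellarg() guards against shell injection from user input.
exec('grep -lF ' . escapeshellarg($pattern) . ' dir/*', $matches);
// $matches now holds one matching file name per entry.
foreach ($matches as $file) {
    echo htmlspecialchars($file), "\n";
}
```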
Could you please suggest an alternative to grep for a web environment?
I have understood the following so far:
A binary of a grep alternative (e.g. sift) could work. However, the problem of executing it on a shared server would remain.
The PHP function `preg_match` is inappropriate here, given the number of files and their size.
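For completeness, this is roughly the pure-PHP approach I ruled out; a simple sketch using `strpos` for a literal string instead of `preg_match` (directory and extension are placeholders):

```php
<?php
// Naive pure-PHP scan: read every file and test for the literal string.
// With ~5,000 files of ~4 MB each (~20 GB total), this seems far too
// slow to run on every web request.
function searchFiles(string $needle, string $dir): array
{
    $matches = [];
    foreach (glob($dir . '/*.txt') as $file) {
        // strpos on the whole file contents; no regex needed for a literal string.
        if (strpos(file_get_contents($file), $needle) !== false) {
            $matches[] = $file;
        }
    }
    return $matches;
}
```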
I am open to implementations of a grep-like function in other languages (e.g. Perl or JavaScript). However, I am not sure whether the performance would be comparable to grep, and whether the problem of execution would remain.
I have looked into different web-hosting providers and understood that a virtual private server (VPS) might be the solution. However, the prices of VPS plans from every hosting provider I have come across are unaffordable.
Any solutions or guidance for this problem?