I have a multi-language PHP application, and all the language data is currently stored in hundreds of small text files. Each file is formatted like this:
en-US: hello
zh-CN: 你好
xx-YY: arbassdgtr
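For reference, loading one of these files boils down to something like the sketch below. This is a hypothetical helper (parseLanguageFile is not a real function in my codebase), just to show what the current per-file approach costs per string:

```php
<?php
// Hypothetical sketch: parse one language file into a locale => string map.
// Assumes the "locale: text" format shown above, one pair per line.
function parseLanguageFile(string $path): array
{
    $map = [];
    foreach (file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        // Split on the first ": " only, so translated text may contain colons.
        [$locale, $text] = explode(': ', $line, 2);
        $map[$locale] = $text;
    }
    return $map;
}
```

So every translatable string on a page means one file open, read, and parse (minus whatever the cache catches).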
To output a page, the application reads many of these files. There is some caching (a script avoids reading the same file twice), but it still looks pretty barbaric to me.
Wouldn't a SQLite DB be much faster, and perhaps less CPU-intensive? The application serves about 300 concurrent users, so thousands of these text files are being opened and read at any given time.
Shouldn't be difficult. Just looking for best practice suggestions before getting the job done.