
I have a multi-language PHP application, and all the language data is currently stored in hundreds of small text files. Each text file is formatted like this:

en-US: hello
zh-CN: 你好
xx-YY: arbassdgtr

To output a page, the application reads multiple files. There is some caching (a script avoids reading the same file twice), but it still looks pretty barbaric to me.
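
Roughly, the per-file lookup boils down to something like this (the function and file names here are just for illustration):

```php
<?php
// Rough sketch of the current approach; names are illustrative only.
// Parses a "lang: text" file into an array and caches it per request.
function loadStrings(string $file): array
{
    static $cache = [];                      // avoid reading the same file twice
    if (!isset($cache[$file])) {
        $strings = [];
        foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            [$lang, $text] = explode(':', $line, 2);
            $strings[trim($lang)] = trim($text);
        }
        $cache[$file] = $strings;
    }
    return $cache[$file];
}

echo loadStrings('lang/greeting.txt')['zh-CN'];   // 你好
```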

Wouldn't an SQLite DB be way faster and perhaps even less CPU-intensive? Consider that the application serves about 300 concurrent users, so there are literally thousands of these text files being read at any given time.

Shouldn't be difficult. Just looking for best practice suggestions before getting the job done.

resle
  • possible duplicate of [Most efficient way to do language file in PHP?](http://stackoverflow.com/questions/1414079/most-efficient-way-to-do-language-file-in-php) – Scott Solmer Mar 25 '15 at 11:31
  • @resle, is my answer sufficient? would you accept it then? – Zuppa Jul 24 '15 at 07:45

1 Answer

This kind of use case and its performance are discussed in the SQLite documentation:

Internal vs. External BLOB storage

The handling of text and BLOBs is not very different.

I am using SQLite for something similar and it works pretty well. It is easier to package, and as long as everything is read-only, performance should be fine even with 300 users.

A side note: consider creating an index on the language field and using a prepared statement to look the texts up. Also, consider the *16 variants if you need to store real Unicode text.
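
A minimal sketch of how the lookup could work with PDO and SQLite (the table and column names are just my assumptions for illustration):

```php
<?php
// Minimal sketch with PDO + SQLite; schema and names are assumptions.
$db = new PDO('sqlite:lang.db');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// One row per (message key, language); the composite primary key
// doubles as the index used for lookups.
$db->exec('CREATE TABLE IF NOT EXISTS translations (
               msg_key     TEXT NOT NULL,
               lang        TEXT NOT NULL,
               translation TEXT NOT NULL,
               PRIMARY KEY (msg_key, lang)
           )');

// Prepare once, reuse for every string on the page.
$stmt = $db->prepare(
    'SELECT translation FROM translations WHERE msg_key = ? AND lang = ?'
);

function t(PDOStatement $stmt, string $key, string $lang): string
{
    $stmt->execute([$key, $lang]);
    $text = $stmt->fetchColumn();
    return $text === false ? $key : $text;   // fall back to the key if missing
}

echo t($stmt, 'greeting', 'zh-CN');          // 你好
```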

Zuppa