I have a vocabulary file with various words and information about them; it's about 100 MB in size. Searching this file takes a very long time, however. Is there any way to improve the speed at which I can look up the data?

For example, I was thinking of writing a program that splits the text file into 26 smaller text files (by the first letter of the word), so that the program would only need to check the first letter of the given word and could then search a much smaller file. Would this improve the execution time of the program? Are there more efficient data structures I could store the data in, like JSON, for example? And what about databases? I'm using Kotlin/Java.
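Here is roughly what I had in mind for the split-by-letter idea. This is just a sketch; the file names, the output directory, and the tab-separated "word<TAB>info" format are placeholders, not my actual format:

```kotlin
import java.io.BufferedWriter
import java.io.File

// Split the big vocabulary file into one file per starting letter.
// Assumes one entry per line, with the word and its info separated by a tab.
fun splitByFirstLetter(source: File, outDir: File) {
    outDir.mkdirs()
    val writers = mutableMapOf<Char, BufferedWriter>()
    source.bufferedReader().useLines { lines ->
        lines.forEach { line ->
            val first = line.firstOrNull()?.lowercaseChar() ?: return@forEach
            if (first in 'a'..'z') {
                val writer = writers.getOrPut(first) {
                    File(outDir, "$first.txt").bufferedWriter()
                }
                writer.write(line)
                writer.newLine()
            }
        }
    }
    writers.values.forEach { it.close() }
}

// Look a word up by scanning only the file for its first letter.
fun lookup(word: String, outDir: File): String? {
    if (word.isEmpty()) return null
    val file = File(outDir, "${word.first().lowercaseChar()}.txt")
    if (!file.exists()) return null
    return file.useLines { lines ->
        lines.firstOrNull { it.substringBefore('\t') == word }
    }
}
```

Would this kind of partitioning actually help, or is it the wrong direction compared to a proper data structure or a database?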
Edit: So far I've just brute-force searched the entire file until I find a match (simplified sketch below). But, as I said, the file is over 100 MB, and execution takes about 5 seconds for a single word. In the future I want the program to look up around 100 words in milliseconds, ideally, similar to how text editors like Word look up words in their vocabularies.
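This is a simplified version of my current approach; "vocabulary.txt" and the tab-separated format stand in for my real setup:

```kotlin
import java.io.File

// Current brute-force lookup: scan the whole ~100 MB file line by line
// until the word at the start of a line matches.
fun bruteForceLookup(word: String): String? =
    File("vocabulary.txt").useLines { lines ->
        lines.firstOrNull { it.substringBefore('\t') == word }
    }
```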