It basically boils down to this: if a directory contains 4000 files, each File.isDirectory() call takes about 1 ms, so processing the whole directory takes 4 s (too slow [1]).
My knowledge of filesystems is limited, but I believe isDirectory() could be batched for all the entries in a directory (reading a chunk of data and then separating each file's metadata). C/C++ code is acceptable (it can be invoked through JNI), but it should be left as a last resort.
I have found FileVisitor, but it doesn't seem to be the best fit for my problem, since I don't need to visit the entire file tree. I also found BasicFileAttributeView, but it seems to have the same problem. This is a related question, but none of its answers provide a significant solution.
[1]: Because this is not the only thing I do, it ends up taking around 17 s.
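To illustrate what I mean by batching, here is a sketch using Files.walkFileTree with maxDepth = 1 (the DirScan class and the listSubdirectories name are mine, just for illustration). With a bounded depth, the visitor receives each entry together with its BasicFileAttributes, so there is no separate isDirectory() call per entry; whether the JDK actually reads those attributes as part of the directory scan or with one stat per entry is platform-dependent, which is part of what I'm asking.

```java
import java.io.IOException;
import java.nio.file.FileVisitOption;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;

public class DirScan {
    // Lists the immediate subdirectories of `root`. With maxDepth = 1, the
    // entries of `root` are handed to visitFile() together with their
    // BasicFileAttributes, so we never call File.isDirectory() per entry.
    static List<Path> listSubdirectories(Path root) throws IOException {
        List<Path> dirs = new ArrayList<>();
        Files.walkFileTree(root, EnumSet.noneOf(FileVisitOption.class), 1,
                new SimpleFileVisitor<Path>() {
                    @Override
                    public FileVisitResult visitFile(Path entry, BasicFileAttributes attrs) {
                        // At the maximum depth, directories are reported here too.
                        if (attrs.isDirectory()) {
                            dirs.add(entry);
                        }
                        return FileVisitResult.CONTINUE;
                    }
                });
        return dirs;
    }
}
```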
Edit: Code:
internal fun workFrom(unit: ProcessUnit<D>) {
    launch {
        var somethingAddedToPreload = false
        val file = File(unit.first)
        ....
        // Load children folders
        file.listFiles(FileFilter {
            it.isDirectory
        })?.forEach {
            getPreloadMapMutex().withLock {
                if (getPreloadMap()[it.path] == null) {
                    val subfiles = it.list() ?: arrayOf()
                    for (filename in subfiles) {
                        addToProcess(it.path, ProcessUnit(it.path + DIVIDER + filename, unit.second))
                    }
                    getPreloadMap()[it.path] = PreloadedFolder(subfiles.size)
                    if (getPreloadMap().size > PRELOADED_MAP_MAXIMUM) cleanOldEntries()
                    getDeleteQueue().add(it.path)
                    somethingAddedToPreload = somethingAddedToPreload || subfiles.isNotEmpty()
                }
            }
        }
        ...
        if (somethingAddedToPreload) {
            work()
        }
    }
}
private fun addToProcess(path: String, unit: ProcessUnit<D>) {
    val f: () -> Pair<String, FetcherFunction<D>> = { load(path, unit) }
    preloadList.add(f)
}

private suspend fun work() {
    preloadListMutex.withLock {
        preloadList.forEach {
            launch {
                val (path, data) = it.invoke()
                if (FilePreloader.DEBUG) {
                    Log.d("FilePreloader.Processor", "Loading from $path: $data")
                }
                val list = getPreloadMap()[path]
                        ?: throw IllegalStateException("A list has been deleted before elements were added. We are VERY out of memory!")
                list.add(data)
            }
        }
        preloadList.clear()
    }
}
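A related point I noticed while profiling: every java.io.File query (isDirectory(), length(), lastModified()) re-stats the file, whereas NIO's Files.readAttributes() fetches the whole attribute record in one call, which can then be queried in memory. A minimal sketch (the AttrBatch class and the describe helper are my own names, for illustration only):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.BasicFileAttributes;

public class AttrBatch {
    // java.io.File re-queries the filesystem on every call; by contrast,
    // Files.readAttributes() reads the full attribute record once, and all
    // subsequent queries on `attrs` are plain in-memory getters.
    static String describe(Path p) throws IOException {
        BasicFileAttributes attrs = Files.readAttributes(p, BasicFileAttributes.class);
        return (attrs.isDirectory() ? "dir" : "file") + ", " + attrs.size() + " bytes";
    }

    public static void main(String[] args) throws IOException {
        System.out.println(describe(Paths.get(args.length > 0 ? args[0] : ".")));
    }
}
```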
PS: I will remove the coroutines in work before implementing an optimization; the complete code is here.