I have the following function, which I use to loop through a directory of PDF files and create a thumbnail for each:
<cffunction name="createThumbnails" returntype="void" output="false">
    <cfscript>
        // Constants
        var _PDF_PATH = APPLICATION.PDFSource & "\PDFs";
        // Set defaults for private variables
        var _qPDFDir = QueryNew("");
        var _documentId = 0;
        var _sourceFilePath = "";
        var _sku = "";
        var _tempImageFilePath = "";
    </cfscript>

    <!--- Retrieve a list of file names in the directory of unprocessed PDF files --->
    <cfdirectory
        action="list"
        directory="#_PDF_PATH#"
        name="_qPDFDir"
        filter="*.pdf"
        type="file"
        sort="datelastmodified DESC"
        listinfo="name" />

    <!--- Loop through the list of file names in the directory of unprocessed PDF files --->
    <cfloop query="_qPDFDir" endrow="500">
        <cfset _sourceFilePath = _PDF_PATH & "\" & name />

        <cfif FileExists(_sourceFilePath) AND IsPDFFile(_sourceFilePath)>
            <cftry>
                <cfpdf
                    action="thumbnail"
                    source="#_sourceFilePath#"
                    destination="#APPLICATION.TempDir#"
                    format="png"
                    scale="100"
                    resolution="high"
                    overwrite="true"
                    pages="1" />
                <cfcatch>
                    <!--- Business rule: quarantine PDFs that cannot be processed --->
                    <cfscript>
                        FileMove(
                            _sourceFilePath,
                            _PDF_PATH & "\NonFunctioning\"
                        );
                    </cfscript>
                </cfcatch>
            </cftry>

            <cfscript>
                _documentId = REQUEST.UDFLib.File.getFileNameWithoutExtension(name);
                _tempImageFilePath = APPLICATION.TempDir & "\" & _documentId & "_page_1.png";

                if (FileExists(_tempImageFilePath)) {
                    _sku = getSkuFromDocumentId(_documentId);
                    if (Len(_sku)) {
                        CreateObject("component", "cfc.products.Product")
                            .setClientId(getClientId())
                            .setId(_sku)
                            .createThumbnails(
                                sourcePath = _tempImageFilePath,
                                deleteSourceFile = true
                            );
                        FileMove(_sourceFilePath, APPLICATION.ProcessedPDFDir);
                    }
                }
            </cfscript>
        </cfif>
    </cfloop>

    <cfreturn />
</cffunction>
Some of this code is not what I would choose to do (e.g. moving files to the "NonFunctioning" directory), but it is required by the business rules.
What I'm trying to figure out is: how can I avoid using up memory when this function runs?
With endrow="500", it bombed with a java.lang.OutOfMemoryError: GC overhead limit exceeded error after processing about 148 files.
I can also watch memory usage climb steadily in the jrun.exe process in Windows Task Manager while the function runs.
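To put numbers on what Task Manager shows, I've been logging JVM heap usage from CFML via java.lang.Runtime (this helper, its name, and the "thumbnailMemory" log file are my own additions, not part of the function above):

```
<cffunction name="logHeapUsage" returntype="void" output="false">
    <cfargument name="label" type="string" required="true" />
    <cfscript>
        // java.lang.Runtime exposes the same JVM heap that jrun.exe is using
        var _runtime = CreateObject("java", "java.lang.Runtime").getRuntime();
        var _usedMB = (_runtime.totalMemory() - _runtime.freeMemory()) / 1048576;
    </cfscript>
    <cflog file="thumbnailMemory"
        text="#ARGUMENTS.label#: #NumberFormat(_usedMB, '999999.9')# MB heap in use" />
</cffunction>
```

Calling logHeapUsage("after file #CurrentRow#") inside the loop confirms the same steady growth: used heap goes up after every cfpdf call and never comes back down.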
Is there any way I can improve the performance of this function to prevent memory from being eaten up?
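One idea I've considered (an untested sketch; createThumbnailBatch, the 25-row batch size, and the structure are all mine) is to shrink the unit of work by walking the query in small startrow/endrow windows rather than 500 rows at once:

```
<!--- Untested sketch: createThumbnailBatch would hold the existing loop
      body, restricted to a window of rows via cfloop's startrow/endrow
      attributes. Batch size and names are arbitrary. --->
<cffunction name="createThumbnailsInBatches" returntype="void" output="false">
    <cfscript>
        var _BATCH_SIZE = 25;
        var _start = 1;
    </cfscript>
    <cfloop condition="_start LTE 500">
        <cfset createThumbnailBatch(startRow = _start, maxRows = _BATCH_SIZE) />
        <cfset _start = _start + _BATCH_SIZE />
    </cfloop>
</cffunction>
```

I'm not convinced this actually helps, though, since it all still runs in a single request, so the GC may not be able to reclaim anything between batches.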