I'm developing an application which reads Exif data from a JPEG file. The data is stored in a struct as below:
struct Metadata{
    int tagID = 0;
    std::string tagIDHex;
    int ifdNo = 0;       // 0 = IFD0, 1 = Exif, 2 = GPS, 3 = Interop, 4 = IFD1
    BYTE* values;
    int noValues = 0;
    long valuesStart = 0;
    int bytesPerValue = 1;
    long dataSize = 0;   // Generally bytesPerValue x noValues
    bool usesOffset = false;
    /* If the number of bytes used by the values is 4 or less, the last 4 bytes of the
       field hold the actual values, otherwise they point to a location elsewhere in the file */
    BYTE fieldData[12];  // Holds the raw 12-byte IFD field

    Metadata(BYTE* data, int ifd){
        ifdNo = ifd;
        tagID = data[0] * 256 + data[1];
        tagIDHex = intToHex(tagID);
        for (int b = 0; b < 12; b++){
            fieldData[b] = data[b];
        }
        // Value count, stored big-endian in bytes 4-7 of the field
        noValues = (int)(fieldData[4] * std::pow(256, 3) + fieldData[5] * std::pow(256, 2)
                       + fieldData[6] * std::pow(256, 1) + fieldData[7] * std::pow(256, 0));
        // Look up the datatype size based on the TIFF spec where 1 = BYTE, etc.
        bytesPerValue = getBytesPerValue(fieldData[3]);
        dataSize = noValues * bytesPerValue;
        usesOffset = (dataSize > 4);
        if (usesOffset){
            values = new BYTE[noValues]; // will get populated later
        }
    }
};
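For context, this is the 12-byte field layout the constructor is decoding (the values in these files are big-endian), as described in the TIFF 6.0 spec. readBE32 below is just a hypothetical helper to illustrate the reads; it isn't in my actual code:

// Hypothetical helper, only to illustrate the big-endian reads done in the constructor
static unsigned long readBE32(const BYTE* p){
    return (static_cast<unsigned long>(p[0]) << 24) |
           (static_cast<unsigned long>(p[1]) << 16) |
           (static_cast<unsigned long>(p[2]) << 8)  |
            static_cast<unsigned long>(p[3]);
}
// IFD field layout per the TIFF 6.0 spec:
//   bytes 0-1   tag ID
//   bytes 2-3   data type (1 = BYTE, 2 = ASCII, 3 = SHORT, 4 = LONG, ...)
//   bytes 4-7   value count  -> equivalent to readBE32(&fieldData[4])
//   bytes 8-11  the values themselves if they fit in 4 bytes, otherwise an offset from the TIFF header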
The following piece of code loops through the fields held in the EXIF IFD and adds each one onto a vector named existingMetadataExif.
for (int f = 0; f < exifFields; f++){
    long tagAddress = exifStart + 2 + f * 12;
    Metadata m = Metadata(&file[tagAddress], 1);
    if (m.usesOffset){
        // Offset to the values, stored big-endian in bytes 8-11, relative to the start of the TIFF header
        m.valuesStart = (int)(tiffStart + (m.fieldData[8] * std::pow(256, 3) + m.fieldData[9] * std::pow(256, 2)
                            + m.fieldData[10] * std::pow(256, 1) + m.fieldData[11] * std::pow(256, 0)));
        for (int d = 0; d < (m.noValues * m.bytesPerValue); d++){
            m.values[d] = file[m.valuesStart + d];
        }
    }
    if (existingMetadataExif.size() > 27){
        bool debug = true;
    }
    existingMetadataExif.push_back(m);
}
The code works fine for some files, but I'm running into a memory problem with others. The problem seems to be tied to reallocation of the vector. All files work fine up to 28 elements, which seems to be the default reserved capacity for the vector. As each element is added, the size and capacity increment together: 0/0, 1/1, 2/2, etc., up to 28/28. When the size reaches 29, the capacity of the vector increases to 42, i.e. a 50% increase on the original capacity.
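For reference, this is a minimal standalone sketch of how I checked the growth pattern; it isn't my real code, just a plain std::vector<int> so I could watch size against capacity after each push_back:

#include <iostream>
#include <vector>

int main(){
    std::vector<int> v;
    for (int i = 0; i < 50; ++i){
        v.push_back(i);
        // Prints size/capacity after each push_back; on VS2013 the capacity
        // jumps from 28 to 42, the same jump I see with the Metadata vector
        std::cout << v.size() << "/" << v.capacity() << std::endl;
    }
    return 0;
}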
Whilst the error is always around the 28th/29th element, it is not totally consistent: one file increases the vector capacity to 42 and immediately crashes with a "triggered a breakpoint" exception, while another crashes as soon as it hits the 28th element.
I have tried existingMetadataExif.reserve(42) inside the code but it makes no difference.
Whilst it seems that the vector reallocation is the trigger point, I'm also wondering about the
values = new BYTE[noValues]
line inside the struct. This is needed because each Metadata can contain a different number of values, including none, but I'm not directly deleting the arrays anywhere as they are needed until the application ends.
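If it ever becomes necessary, the only place I can see to free them is at shutdown, with something like this untested sketch (the guard matters because elements that didn't use an offset never assigned values):

// Untested sketch: free the value arrays once the metadata is finished with.
// Only elements that used an offset ever allocated an array; for the rest,
// 'values' was never assigned, so it must not be deleted.
for (Metadata& m : existingMetadataExif){
    if (m.usesOffset){
        delete[] m.values;
        m.values = nullptr;
    }
}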
I'm developing in Visual Studio 2013 on Windows 8.1, but I'm not using any MS-specific code as this application will eventually be ported to iOS.
EDIT
Just to clarify: existingMetadataExif is the vector, declared elsewhere in the code, and the error occurs at
existingMetadataExif.push_back(m);
The lines
if (existingMetadataExif.size() > 27){
    bool debug = true;
}
are irrelevant and can be ignored; I only put them in to help with my own debugging attempts.