While parsing tokens, all of my token pointers become bad. When I inspect a token in the debugger I get: CXX0030: Error: expression cannot be evaluated, and the token has the value 0x00000000.
I am missing something here; I have tried quite a bit but cannot correct it. I suspect it has something to do with the pointer declarations (const and so on), but changing those does not help.
Environment: Windows, Visual Studio 2010; the file being parsed is Unicode (UTF-16).
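For reference, fin is the wide input stream the line is read from; its declaration is not part of the snippet below, so the following is only my assumption of roughly how it is set up (the file name is made up):

#include <fstream>

// Assumed setup: a wide-character file stream opened on the UTF-16 file.
std::wifstream fin("tokens.txt");
if (!fin.is_open())
{
    // handle the open failure
}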
code snippet ------
const wchar_t* const DELIMITER = L"\"";
wchar_t buf[MAX_CHARS_PER_LINE];
fin.getline(buf, MAX_CHARS_PER_LINE);
wchar_t* token[MAX_TOKENS_PER_LINE] = {};
token[0] = wcstok(buf, DELIMITER);
if (token[0]) // zero if line is blank
{
    int n = 0;
    for (n = 0; n < MAX_TOKENS_PER_LINE; n++)
    {
        token[n] = wcstok(0, DELIMITER); // subsequent tokens --> this is where the pointer becomes bad
        if (!token[n]) break;            // no more tokens --> execution never gets past this line
    }
}
A separate problem: buf ends up holding the entire file rather than a single line, even though I am reading it with getline. Can someone help me point out what mistake I made here?
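For reference, the tokenizing pattern I am trying to follow, isolated on a hard-coded line, would look roughly like this (the sample text and the wmain wrapper are only for illustration; it uses the two-argument wcstok from the VS2010 CRT):

#include <wchar.h>

int wmain()
{
    const wchar_t* const DELIMITER = L"\"";
    wchar_t line[] = L"key\"value\"more";    // made-up sample line

    wchar_t* tok = wcstok(line, DELIMITER);  // first token
    while (tok)
    {
        wprintf(L"token: %ls\n", tok);       // print each token
        tok = wcstok(0, DELIMITER);          // subsequent tokens
    }
    return 0;
}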