
I am experiencing a few problems with Crypto++'s Integer class. I am using the latest release, 5.6.2.

I'm attempting to convert Integer to string with the following code:

CryptoPP::Integer i("12345678900987654321");

std::ostrstream oss;
oss << i;
std::string s(oss.str());
LOGDEBUG(oss.str()); // Pumps log to console and log file

The output appears to have extra garbage data:

12345678900987654321.ÍÍÍÍÍÍÍÍÍÍÍýýýý««««««««îþîþ

I get the same thing when I output directly to the console:

std::cout << "Dec: " << i << std::endl; // Same result

Additionally, I cannot get precision or scientific notation working. The following will output the same results:

std::cout.precision(5); // Does nothing with CryptoPP::Integer
std::cout << "Dec: " << std::setprecision(1) << std::dec << i << std::endl;
std::cout << "Sci: " << std::setprecision(5) << std::scientific << i << std::endl;

On top of all of this, sufficiently large numbers break the entire thing.

CryptoPP::Integer i("12345");

// Calculate i^16
for (int x = 0; x < 16; x++)
{
    i *= i;
}

std::cout << i << std::endl; // Will never finish

Ultimately I'm trying to get something where I can work with large Integer numbers, and can output a string in scientific notation. I have no problems with extracting the Integer library or modifying it as necessary, but I would prefer working with stable code.

Am I doing something wrong, or is there a way that I can get this working correctly?

jww
  • Show the definition of `LOGDEBUG`. Sounds like you're calling `printf` and passing the string object returned from `ostrstream` instead of a pointer to the text buffer. Since no call has been made to force the buffer to be null-terminated, it's possible you'll see garbage appear after the text. Try `LOGDEBUG(oss.str().c_str())` – Captain Obvlious Apr 27 '15 at 17:23
  • Quite frankly, LOGDEBUG is not the problem here. As specified in my post, std::cout produces the same output. I could completely remove my Urho3D stuff (including LOGDEBUG) and std::cout will still produce the same problem. –  Apr 27 '15 at 17:29
  • *"On top of all of this, sufficiently large numbers breaks the entire thing"* - yes, that's documented behavior. The exact size is *absolute value less than `(256**sizeof(word)) ** (256**sizeof(int))`*. See the comments in the [Integer header file](http://www.cryptopp.com/docs/ref/integer_8h_source.html). That's an awfully big number, even for cryptography. Perhaps you should move to a library that provides a general-purpose integer, like [GNU's Multiple Precision library](https://gmplib.org/) (GMP). – jww Apr 28 '15 at 21:20
  • *"Additionally, I cannot get precision or scientific notation working"* - yeah, the `Integer` class does not respond to them. There is a patch for `std::showbase` and suffixes (such as the trailing dot with decimal output) at [Integer patch](http://www.cryptopp.com/wiki/Integer_Patch). – jww Apr 28 '15 at 21:33
  • @Thebluefish "I could completely remove my Urho3D stuff (including LOGDEBUG) and std::cout will still produce the same problem", well, **do that then**. You're supposed to post [Minimal](http://stackoverflow.com/help/mcve) code to reproduce the problem anyway. – M.M May 06 '15 at 04:37
  • Consider that I did. After that statement I elaborate on the fact that I got the same issue without it. I put that code there to show what I was originally trying to do (manipulate the string outside of std) in case that mattered in the solution. There's no harm in putting in more detail if it will help, and I certainly wasn't just dumping code. I am well aware of the rules and formatting. –  May 06 '15 at 05:53

2 Answers


I'm attempting to convert Integer to string with the following code:

CryptoPP::Integer i("12345678900987654321");

std::ostrstream oss;
oss << i;
std::string s(oss.str());
LOGDEBUG(oss.str()); // Pumps log to console and log file

The output appears to have extra garbage data:

12345678900987654321.ÍÍÍÍÍÍÍÍÍÍÍýýýý««««««««îþîþ

I can't reproduce this with Crypto++ 5.6.2 on Visual Studio 2010. The corrupted output is likely the result of some other issue, not a bug in Crypto++. If you haven't done so already, I'd suggest trying to reproduce this in a minimal program just using CryptoPP::Integer and std::cout, and none of your other application code, to eliminate all other possible problems. If it's not working in a trivial stand-alone test (which would be surprising), there could be problems with the way the library was built (e.g. maybe it was built with a different C++ runtime or compiler version from what your application is using). If your stand-alone test passes, you can add in other string operations, logging code etc. until you find the culprit.

I do notice, though, that you're using std::ostrstream, which is deprecated. You may want to use std::ostringstream instead. This Stack Overflow answer to the question "Why was std::strstream deprecated?" may be of interest, and it may even be the case that the issues mentioned in that answer are causing your problems here.

Additionally, I cannot get precision or scientific notation working. The following will output the same results:

std::cout.precision(5); // Does nothing with CryptoPP::Integer
std::cout << "Dec: " << std::setprecision(1) << std::dec << i << std::endl;
std::cout << "Sci: " << std::setprecision(5) << std::scientific << i << std::endl;

std::setprecision and std::scientific modify floating-point input/output, so this wouldn't work with regular integer types in C++ like int or long long either (though I can see that, especially with arbitrary-length integers like CryptoPP::Integer, being able to output in scientific notation with a specified precision would make sense).

Even if C++ defined it differently, Crypto++'s implementation would still need to heed those flags. From looking at the Crypto++ implementation of std::ostream& operator<<(std::ostream& out, const Integer &a), I can see that the only iostream flags it recognizes are std::ios::oct and std::ios::hex (for octal and hexadecimal output respectively).

If you want scientific notation, you'll have to format the output yourself (or use a different library).

On top of all of this, sufficiently large numbers break the entire thing.

CryptoPP::Integer i("12345");

// Calculate i^16
for (int x = 0; x < 16; x++)
{
    i *= i;
}

std::cout << i << std::endl; // Will never finish

That will actually calculate i^(2^16) = i^65536, not i^16, because on each iteration you're multiplying i by its current intermediate value, not by the original value. The actual result with this code would be 268,140 digits long, so I expect it's just taking Crypto++ a long time to produce that output.

Here is the code adjusted to produce the correct result:

CryptoPP::Integer i("12345");
CryptoPP::Integer i_to_16(1);

// Calculate i^16
for (int x = 0; x < 16; x++)
{
    i_to_16 *= i;
}

std::cout << i_to_16 << std::endl;
softwariness
  • *"The corrupted output is likely the result of some other issue, not a bug in Crypto++. "* +1. `LOGDEBUG` was my first suspect until he stated he could reproduce with `cout`. – jww Apr 28 '15 at 21:24
  • This is very good information, thank you! I completely overlooked that large number issue, you were right in that I wanted to take it to the power of 16. I will test these tonight and try to isolate the issue further. Any ideas on the precision/scientific notation? I cannot seem to find any documentation on whether this is support or where it would need to be implemented. –  Apr 28 '15 at 21:28
  • @Thebluefish Updated answer to address the precision/scientific issue, but in short it won't work. – softwariness Apr 28 '15 at 22:00
  • Aha, yes the issue was with ostrstream. As soon as I replaced it with ostringstream and recompiled, the output appears exactly as I expected it. I was also able to modify the Integer library to handle std::scientific. Thanks very much for your detailed answer :) –  May 08 '15 at 18:37
LOGDEBUG(oss.str()); // Pumps log to console and log file

The output appears to have extra garbage data:

12345678900987654321.ÍÍÍÍÍÍÍÍÍÍÍýýýý««««««««îþîþ

I suspect what you presented is slightly simplified from what you are doing in real life. I believe the problem is related to LOGDEBUG and the ostringstream. And I believe you are outputting char*'s, and not strings (though we have not seen the code for your logger).

The std::string returned from oss.str() is temporary. So this:

LOGDEBUG(oss.str());

is slightly different from this:

string t(oss.str());
LOGDEBUG(t);

You should always make a copy of the string in an ostringstream when you intend to use it. Or ensure the use is contained in one statement.

The best way I've found is to have:

// Note: reference, and the char* is used in one statement
void LOGDEBUG(const ostringstream& oss) {
    cout << oss.str().c_str() << endl;
}

Or

// Note: copy of the string below
void LOGDEBUG(string str) {
    cout << str.c_str() << endl;
}

You can't even do this (this one bit me in production):

const char* msg = oss.str().c_str();
cout << msg << endl;

You can't do it because the string returned from oss.str() is temporary. So the char* is junk after the statement executes.

Here's how you fix it:

const string t(oss.str());
const char* msg = t.c_str();
cout << msg << endl;

If you run Valgrind on your program, you will probably get seemingly unexplained findings related to your use of ostringstream and strings.

Here is a similar logging problem: stringstream temporary ostream return problem. Also see Turning temporary stringstream to c_str() in single statement. And here was the one I experienced: Memory Error with std:ostringstream and -std=c++11?


As Matt pointed out in the comment below, you should be using an ostringstream, and not an ostrstream. ostrstream has been deprecated since C++98, and you should have gotten a warning when using it.

So use this instead:

#include <sstream>
...

std::ostringstream oss;
...

But I believe the root of the problem is the way you are using the std::string in the LOGDEBUG function or macro.


Your other questions related to Integer were handled in Softwariness's answer and related comments. So I won't rehash them again.

jww
  • `oss.str()` and `string(oss.str())` should behave the same – M.M May 06 '15 at 04:36
  • Thanks @Matt. I was not sure if that would make a difference. I've actually seen GCC produce bad code with Crypto++. The bad code related to the scheduling of the destructors. There was more to it, and the fix was to ensure you did *not* use anonymous declarations. – jww May 06 '15 at 04:40
  • Actually, I take that back: OP is using `ostrstream`, not `ostringstream`, so there might be a difference. – M.M May 06 '15 at 04:41
  • Aha, yes the issue was with ```ostrstream```. As soon as I replaced it with ```ostringstream``` and recompiled, the output appears exactly as I expected it. –  May 08 '15 at 18:35
  • @Thebluefish - Very good. By the way, when you misuse an `ostringstream` (like holding onto the `char*`), that's exactly one of the symptoms you see in practice. – jww May 08 '15 at 19:51
  • Thanks for letting me know. I'm doing a copy before it gets initialized, so that's no worries. Cheers! –  May 08 '15 at 19:53
  • @Thebluefish - related to `setprecision`: if you provide requirements, then I can take a look at incorporating it in the [Integer Patch](http://www.cryptopp.com/wiki/Integer_Patch). I don't think I've ever used `setprecision`, so I don't know the expected behavior. And I don't know the reasonable way to handle large integers (like 128-bit or 256-bit or above) versus a 16-bit, 32-bit or 64-bit integer. – jww May 11 '15 at 17:02
  • @jww I've technically got it working to what I need it. I use it solely with ```std::ios::scientific```, and it will produce results such as ```1.2345e20``` for a precision of 6, or ```1.2e24``` for a precision of 2. –  May 11 '15 at 17:24