I am trying to insert an object of my own class into a std::set. I have the following code:
primefactors.insert(MultiplyOfPrimeNumber(nextPrimeNumber.GetNumber(), 1));
Here primefactors is a std::set and MultiplyOfPrimeNumber is a class that looks like this:
#pragma once

class MultiplyOfPrimeNumber
{
public:
    MultiplyOfPrimeNumber(void);
    MultiplyOfPrimeNumber(int, int);        // prime number, power
    MultiplyOfPrimeNumber(const MultiplyOfPrimeNumber&);
    int PrimeNumber() const;
    int Power() const;
    void AddPower(int);
    void ChangePrimeNumber(int);
    ~MultiplyOfPrimeNumber(void);

private:
    int primenumber;
    int power;
};

bool operator<(const MultiplyOfPrimeNumber&, const MultiplyOfPrimeNumber&);
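For reference, primefactors itself is declared roughly like this (I haven't pasted the real declaration, so this is just its assumed form):

#include <set>

// Assumed declaration: the set orders its elements with the operator< declared above.
std::set<MultiplyOfPrimeNumber> primefactors;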
And the implementation of the operator< overload:
bool operator<(const MultiplyOfPrimeNumber& left, const MultiplyOfPrimeNumber& right)
{
    // Order by the prime number only; the power is not compared.
    return left.PrimeNumber() < right.PrimeNumber();
}
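As far as I understand, the set uses this comparator both for ordering and for detecting duplicates, so two objects with the same prime number but different powers should count as equivalent. A minimal sketch of what I expect:

std::set<MultiplyOfPrimeNumber> s;
s.insert(MultiplyOfPrimeNumber(3, 1));
s.insert(MultiplyOfPrimeNumber(3, 2)); // rejected: equivalent to (3, 1) under this operator<
// s.size() is now 1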
This code compiles without errors, but when I insert the object into the set, the two integers my object contains both become -842150451 (is that the maximum value of an int?). Why does -842150451 get assigned to my integers?