Initialize your variables and you will see what is happening. It isn't ignoring the decimal; the decimal is causing an error that stops the parsing. So the crazy number you see is actually the value of an uninitialized integer.
Here is what is happening:
When you type "10 2.5", it puts 10 into a and 2 into b. It does not ignore the 0.5. To understand what actually happens, try this code:
#include <iostream>
using namespace std;

int main() {
    int a = 100, b = 200, c = 300;
    cin >> a >> b >> c;
    cout << a << endl;
    cout << b << endl;
    cout << c << endl;
}
Then enter "10 2.5": a will be 10, b will be 2, and c will be 300! The ".5" caused cin to fail, so it left c at the value it was initialized to. But since your original code only read two values, it seemed to work just fine. Now try that version with your second set of inputs, "2.5 10": a will be 2, then b will be 200 and c will be 300. That shows how cin hit an error the moment it saw the decimal point and gave up on everything after it. (One caveat: on a C++11 or later compiler, a failed extraction writes 0 into the variable instead of leaving it untouched, so the failed reads would print as 0 rather than 200 or 300.)
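If you want to see the failure directly instead of inferring it from leftover values, you can test the stream state yourself. Here is a minimal sketch of that idea (the message text is just for illustration), which checks cin.fail() after the reads and then clears the error flags:

#include <iostream>
#include <limits>
using namespace std;

int main() {
    int a = 100, b = 200, c = 300;
    cin >> a >> b >> c;
    if (cin.fail()) {
        cout << "extraction failed partway through" << endl;
        cin.clear();  // reset the error flags so the stream is usable again
        cin.ignore(numeric_limits<streamsize>::max(), '\n');  // discard the bad input
    }
    cout << a << " " << b << " " << c << endl;
}

With "10 2.5" as input, the read into c is what trips cin.fail(), and the leftover ".5" is what gets discarded.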
And finally, just for fun, remove the initializations in my example and watch how you can get crazy values for b and c: reading an uninitialized int is undefined behavior, so you might see anything.
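For reference, a sketch of that uninitialized version, assuming a pre-C++11 compiler and the input "2.5 10":

#include <iostream>
using namespace std;

int main() {
    int a, b, c;  // deliberately uninitialized
    cin >> a >> b >> c;
    // b and c are never written (the ".5" makes both extractions fail),
    // so printing them reads uninitialized memory: the "crazy" values.
    cout << a << endl;
    cout << b << endl;
    cout << c << endl;
}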