
Ok, this is a pretty weird problem in my opinion. I have a controller endpoint that looks like this:

    [Route("CreateBox")]
    [HttpPost]
    public async Task<IActionResult> CreateBox([FromBody] CreateBoxCommand command)
    {
        var events = await _mediator.Send(command);
        if (events != null)
        {
            await _mediator.Publish(events);
            return Ok(events);
        }
        return BadRequest("Please check that WarehouseId is included in request");
    }

and a CreateBoxCommand that looks like this:

    [DataContract]
    public class CreateBoxCommand : IRequest<EventList>
    {
        [DataMember]
        public int BoxType { get; set; }
        [DataMember]
        public int BoxContains { get; set; }
        [DataMember]
        public int Location { get; set; }
        [DataMember]
        public Guid WarehouseId { get; set; }

        public CreateBoxCommand(int boxType, int boxContains, int location,
            Guid warehouseId)
        {
            BoxType = boxType;
            BoxContains = boxContains;
            Location = location;
            WarehouseId = warehouseId;
        }
    }

When I send a POST to the endpoint from Postman with the following payload, my BoxType int is always converted from the value 0303 to the value 195!

payload:

    {
        "WarehouseId": "7922126f-fef8-4d70-b7b4-398f2067c4aa",
        "BoxType" : 0303,
        "BoxContains" : 1,
        "Location" : 0001
    }

All other int values are persisted correctly in the request, but stepping through the debugger during model binding, you can see that 0303 has been converted to 195. I have no references to, or hard-coding of, 195 anywhere in the codebase.


I see online that an ASCII table lists octal 0303 as decimal 195, and I'm guessing that this is the issue, but why on earth would it make this conversion?

Has anyone else experienced this with Postman or .NET Core 2?


Edit: I'm thinking I will just restructure BoxType as an enum here eventually anyway, and maybe this will circumvent the issue; it's just very strange to see it at the moment.

Lamar
    Because input is a string (or a literal number) which starts with `0`. That prefix is used to express numbers in **octal**: 0303 (base 8) is 195 (base 10). – Adriano Repetti Jul 05 '18 at 16:39
  • Thanks! wow I guess I didn't think to check that. I'm surprised i've never run into this before. Maybe because I haven't done much c# dev. Is this a .Net thing? – Lamar Jul 05 '18 at 16:43
  • No, it's pretty common in a plethora of different languages (from C, to C++, C#, Java, JavaScript, Python and many, many others) – Adriano Repetti Jul 05 '18 at 16:46
  • Hmm, yeah I see that from the duplicate post linked to this now. Thanks for the help. Should I delete this post if it is a duplicate? That duplicate question was not suggested when writing this one. – Lamar Jul 05 '18 at 16:51
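The arithmetic in the comments above is easy to verify. A minimal C# check (note that C# itself has no octal integer literals, so the base-8 conversion is done explicitly via `Convert`):

    using System;

    class OctalDemo
    {
        static void Main()
        {
            // In C-family octal notation, 0303 means base 8:
            // 3*64 + 0*8 + 3 = 195 in base 10.
            Console.WriteLine(Convert.ToInt32("303", 8));   // 195
            Console.WriteLine(Convert.ToString(195, 8));    // 303
        }
    }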

1 Answer

The leading 0 causes the value to be interpreted as octal, which is then converted to decimal when it's bound to the int. If you need to include the 0 prefixes, then you'll need to send the value as a string and parse it to an int manually.
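A sketch of that workaround, assuming you change the DTO property to a string and parse server-side (the property and variable names here are illustrative, not from the original code):

    using System;

    class ParseDemo
    {
        static void Main()
        {
            // Send "BoxType": "0303" as a JSON *string*, bind it to a
            // string property, then parse it yourself. int.Parse treats
            // the input as decimal, so the leading zero is simply ignored.
            string rawBoxType = "0303";
            int boxType = int.Parse(rawBoxType);
            Console.WriteLine(boxType); // 303
        }
    }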

Chris Pratt