
I am attempting to deserialize some XML data I'm receiving over a socket into usable objects. At first I was having trouble with 0x10 characters, which are explicitly invalid in XML 1.0, and since XML 1.1 is not supported by .NET I was told I needed to encode the affected strings as Base64.

So this is what I've done with my class for the XML string:

    [XmlRoot]
    public class message
    {
        [XmlElement]
        public string type { get; set; }
        [XmlElement]
        public string user { get; set; }
        [XmlElement]
        public string cmd { get; set; }
        [XmlElement]
        public string host { get; set; }
        [XmlElement]
        public byte[] msg { get; set; }

        public string GetCommand()
        {
            return System.Text.Encoding.UTF8.GetString(msg);
        }
    }

I read here: XmlSerializer, base64 encode a String member that I can declare the property as a `byte[]` and it will automatically be encoded as Base64, so that's what I did. I then added a method to retrieve the decoded bytes as a human-readable string I can use, hoping to side-step the serialization issues.
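To illustrate what I mean, this is a minimal sketch of the serialization side (the payload is "57" followed by two 0x10 bytes, and the field values are the placeholders from my example below):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot]
public class message
{
    [XmlElement] public string type { get; set; }
    [XmlElement] public string user { get; set; }
    [XmlElement] public string cmd { get; set; }
    [XmlElement] public string host { get; set; }
    [XmlElement] public byte[] msg { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        var m = new message
        {
            type = "SERVER",
            user = "TestDeleteOrKillMe",
            cmd = "PRIVATE_MSG",
            host = "65.255.81.81",
            // "57" followed by two 0x10 control bytes
            msg = new byte[] { 0x35, 0x37, 0x10, 0x10 }
        };

        var serializer = new XmlSerializer(typeof(message));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, m);
            // <msg> comes out Base64-encoded: <msg>NTcQEA==</msg>
            Console.WriteLine(writer.ToString());
        }
    }
}
```

So on the sending side the `byte[]` property really does produce Base64 in `<msg>`, exactly as the linked answer describes.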

However, I am getting the same error when attempting to deserialize the XML string ((0x10) marks where the offending characters are; they won't display in this post):

XML STRING

<?xml version=\"1.0\"?><message><type>SERVER</type><user>TestDeleteOrKillMe</user>
<cmd>PRIVATE_MSG</cmd><host>65.255.81.81</host><msg>57(0x10)(0x10)</msg></message>

ERROR RECEIVED

 {"'', hexadecimal value 0x10, is an invalid character. Line 1, position 135."}

So essentially this approach netted me the same result, and I'm having trouble understanding why. Could someone please point me to demonstration code or information on why this is happening?
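In case it helps, here is a self-contained repro of the failure, using the same class and the XML string from above (the 0x10 characters are written as `\u0010` escapes so they survive this post):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot]
public class message
{
    [XmlElement] public string type { get; set; }
    [XmlElement] public string user { get; set; }
    [XmlElement] public string cmd { get; set; }
    [XmlElement] public string host { get; set; }
    [XmlElement] public byte[] msg { get; set; }
}

public static class Repro
{
    public static void Main()
    {
        // incoming XML with two literal 0x10 characters inside <msg>
        string xml = "<?xml version=\"1.0\"?><message><type>SERVER</type>"
                   + "<user>TestDeleteOrKillMe</user><cmd>PRIVATE_MSG</cmd>"
                   + "<host>65.255.81.81</host><msg>57\u0010\u0010</msg></message>";

        var serializer = new XmlSerializer(typeof(message));
        try
        {
            using (var reader = new StringReader(xml))
            {
                serializer.Deserialize(reader);
            }
        }
        catch (InvalidOperationException ex)
        {
            // Inner XmlException: "'', hexadecimal value 0x10, is an invalid character. ..."
            Console.WriteLine(ex.InnerException?.Message);
        }
    }
}
```

The `Deserialize` call always throws here, even though the `msg` property is a `byte[]`.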
