
Usually when I'm doing network-based work I use Protobuf to move custom objects around the network; those objects are modelled for the system being built and carry my data/composed objects from other systems.

I am currently involved in enhancing a project that uses a proprietary text-based protocol (predominantly serialising data with , | and [] notation) and EntitySpaces as the data access layer.

The question I'm asking is: should I create another layer of objects (the Protobuf network objects) that are populated when data is loaded from the EntitySpaces objects, or should I add the necessary protobuf tags to the EntitySpaces objects themselves (they are auto-generated from the database and currently don't have those tags)?
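
To illustrate the first option, I'd be looking at something like this (a rough sketch; NetworkOrder and OrderEntity are made-up names standing in for a Protobuf DTO and an EntitySpaces-generated entity):

using ProtoBuf;

// Hypothetical Protobuf DTO, kept separate from the generated EntitySpaces entity.
[ProtoContract]
public class NetworkOrder
{
    [ProtoMember(1)]
    public int Id { get; set; }

    [ProtoMember(2)]
    public string Description { get; set; }

    // Populate the DTO from a loaded (hypothetical) EntitySpaces entity.
    public static NetworkOrder FromEntity(OrderEntity entity)
    {
        return new NetworkOrder { Id = entity.Id, Description = entity.Description };
    }
}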

Paul Farry

1 Answer


Could go either way. If you need to encode protobuf data into a text format: use base-64, which conveniently doesn't use | / [ / ].
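
For example, a minimal base-64 round-trip using protobuf-net's Serializer (a sketch; the helper class name is made up):

using System;
using System.IO;
using ProtoBuf;

static class Base64Protobuf
{
    // Serialize any protobuf-net contract type and encode it as base-64 text.
    public static string Serialize<T>(T message)
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, message);
            return Convert.ToBase64String(ms.ToArray());
        }
    }

    // Decode the base-64 text and deserialize the original object.
    public static T Deserialize<T>(string payload)
    {
        using (var ms = new MemoryStream(Convert.FromBase64String(payload)))
        {
            return Serializer.Deserialize<T>(ms);
        }
    }
}

The resulting string uses only A-Z, a-z, 0-9, +, / and =, so it passes safely through the existing , | [ ] framing.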

If you prefer to protobuf-encode objects that aren't currently attributed, there are ways to do that too: if the types from the database are generated as partial classes, then at the simplest you can just do (in another code file):

namespace The.Same.Namespace {
    [ProtoContract]
    [ProtoPartialMember(1, "Foo")]
    [ProtoPartialMember(2, "Bar")]
    // ... more [ProtoPartialMember] attributes as needed
    partial class SomeEntity {}
}

where Foo and Bar are members you want serialized. Alternatively, you can configure the entire model at runtime:

RuntimeTypeModel.Default.Add(typeof(SomeEntity), false).Add("Foo", "Bar");

(that is a very basic configuration; much more subtle options are available)
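
For instance, the same runtime mapping can be spelled out with explicit field numbers (still a sketch; Foo and Bar are the placeholder members from above):

using ProtoBuf.Meta;

// Pin the field numbers explicitly rather than letting them be assigned in order.
var meta = RuntimeTypeModel.Default.Add(typeof(SomeEntity), false);
meta.Add(1, "Foo");           // serialize Foo as field 1
meta.Add(2, "Bar");           // serialize Bar as field 2
meta.UseConstructor = false;  // optional: skip the constructor during deserialization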

Marc Gravell
  • Reviewed this some more over the weekend. There is some direct entity transfer over the network, but there is additional and different data as well, so I will probably just use custom standard Protobuf-tagged classes. Thanks – Paul Farry Dec 16 '12 at 23:02