
I've created 3 proto files and would like to keep them in a git repo:

[image: the repository containing only the proto files]

separate from all other files. The repository contains only .proto files. I have 3 microservices, each in its own repository, and they use these proto files to communicate with each other:

[image: the microservices consuming the proto files]

As you can see in the picture above, the proto files are consumed by different microservices.

Assume I change Protofile2 and push the change to the proto repository (remember, the proto files repository is separate from the microservice repositories):

[image: a change pushed to Protofile2]

When I run go test on service1 or service2, it should tell me that Protofile2 has changed, i.e. that its hash no longer matches the proto file in the service2 folder:

[image: the failing test output]

So I know I have to generate the code again.

Is there any tool that solves this problem? Or how should I solve it?

softshipper

2 Answers


Here's what I suggest:

  • Store your protos (and the makefiles that generate their Go code) in a single git repo. Keep each definition in its own directory for import simplicity
  • Tag the repo with a version, especially on potentially breaking changes
  • Import a particular proto definition from your microservices, e.g. import "github.com/me/myproto/protodef2"
  • Use Go modules (introduced with Go v1.11 in 2018) to ensure microservice X gets a compatible version of proto definition Y
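As a concrete sketch (the module paths and version tag here are illustrative, not from the question), a consuming service pins a tagged release of the proto repo in its go.mod:

```
// go.mod of service1 (illustrative)
module github.com/me/service1

go 1.13

require github.com/me/myproto v1.2.0
```

Bumping that version (e.g. via go get github.com/me/myproto@v1.3.0) is then an explicit, reviewable change in the service's own repo, rather than a silent drift.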

To point 2, and as @PaulHankin mentioned, try not to break backward compatibility. Protobuf fields can be removed, but as long as the remaining fields keep their field numbers, old client calls will still be compatible with newer proto defs.
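For example, when a field does have to go away, proto3 lets you reserve its number and name so they can't be accidentally reused with a different meaning later. A sketch (message and field names are made up):

```
syntax = "proto3";

message Order {
  // Field 2 ("qty") was removed; reserving it keeps old clients safe
  // and prevents the number from being recycled for something else.
  reserved 2;
  reserved "qty";

  string id = 1;
  string customer = 3;
}
```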

colm.anseo
  • Hey this is great, how on earth does that import work? I’m trying to find any type of documentation and can’t. Is such an import in the proto file? I found https://jbrandhorst.com/post/go-protobuf-tips/ where he talks about using the go_package option. Any pointers helpful I’m new to this. – Yehuda Makarov May 22 '20 at 22:00
  • 1
    Wait, do you mean that the repo has proto files and ALSO the generated .pb.go files? And then a service in a separate repo and import the right version of a generated pb.go file? – Yehuda Makarov May 24 '20 at 04:34
  • 1
    @YehudaMakarov yes. Once a service matures - the `.proto` file will rarely change - and a go service (to compile) relies on the `*.pb.go` files, so it's not necessary for any CI/CD process to include the protobuf compilation step. And maintaining the `proto` & `.pb.go` in a single repo with a `go.mod` allows client/server code to pull in the precise release version they need (via the semver git release tags). – colm.anseo May 24 '20 at 13:07
  • 1
    Any sugessions for python? – A l w a y s S u n n y Apr 04 '22 at 18:40

Usually one tries to make protocol buffers backwards compatible, so that services that depend on a proto file don't necessarily need to be changed when the proto file changes. See https://developers.google.com/protocol-buffers/docs/proto3#updating

However, if you want to check, you can write a test using proto.GetProperties(msgType). Put the expected struct properties in a literal, and use reflect.DeepEqual to compare it to the dynamic struct properties you get from calling proto.GetProperties on the dynamic type of your proto. Something like this:

import (
    "reflect"
    "testing"
    "github.com/golang/protobuf/proto"
)

func TestMyProtoStructVersion(t *testing.T) {
    // GetProperties expects a struct type, hence the .Elem() on the pointer type.
    gotProps := proto.GetProperties(reflect.TypeOf(&mypb.MyProtoStruct{}).Elem())
    if !reflect.DeepEqual(gotProps, wantMyProtoStructProps) {
        t.Errorf("MyProtoStruct proto has changed")
    }
}

You could use go generate to automate the process of creating a file containing the expected struct properties (wantMyProtoStructProps) of your protocol buffer that you can include in the test.

Paul Hankin