
So, I've been searching around for the ideal way to build a cross-platform .NET (C#) application that also relies on some native code.

Since I couldn't find a de-facto way to build these things together, I turned my attention to the following workflow:

  • Build the native code (C++ → a native library: .dll/.dylib/.so/etc.);
  • Generate bindings (with SWIG or something similar), or keep a project with hand-written ones;
  • Package said project into a NuGet package;
  • Consume it from the cross-platform application.
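As I understand it, the packaging step would lean on NuGet's `runtimes/{rid}/native` layout, which lets the runtime pick the right binary per platform. A minimal hypothetical `.nuspec` sketch for one of the per-platform packages (the package id, file path, and RID are placeholders I made up):

```xml
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>MyNativeLib.runtime.win-x64</id>
    <version>1.0.0</version>
    <authors>me</authors>
    <description>Native win-x64 binaries for MyNativeLib.</description>
  </metadata>
  <files>
    <!-- runtimes/{rid}/native is the layout NuGet and the .NET host use
         to select the matching native binary for the running platform -->
    <file src="build/win-x64/mynativelib.dll" target="runtimes/win-x64/native/" />
  </files>
</package>
```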

Now, Microsoft itself has some suggestions about this. But this guide - and every other one I've found so far - has the same quirk: it includes pre-compiled binaries in the folder structure. This bothers me because:

  • It makes version control hard, because you have hard-copied build artifacts;
  • It completely breaks CI/CD, because you can't just have a matrix of CI machines, each one doing its own build/package/upload;
  • It breaks the development workflow, because you constantly need to build and copy files over.

But, looking around the NuGet Gallery, there seem to be packages closer to what I have in mind, for example the SkiaSharp and Avalonia packages.

My idea is that there should be a series of for-one-platform-only packages (plus eventually an "aggregator" package) that could be transparently consumed.
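Roughly, the aggregator would be a metapackage that just depends on the per-platform packages, so consumers reference a single id. A hypothetical sketch (all ids and versions are made up for illustration):

```xml
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>MyNativeLib</id>
    <version>1.0.0</version>
    <authors>me</authors>
    <description>Aggregator: pulls in the per-platform native packages.</description>
    <dependencies>
      <group targetFramework="net6.0">
        <!-- All of these restore; at run time the host loads only the
             binary under the matching runtimes/{rid}/native folder -->
        <dependency id="MyNativeLib.runtime.win-x64" version="1.0.0" />
        <dependency id="MyNativeLib.runtime.linux-x64" version="1.0.0" />
        <dependency id="MyNativeLib.runtime.osx-x64" version="1.0.0" />
      </group>
    </dependencies>
  </metadata>
</package>
```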

Is this a possible workflow? If yes, how? If not, what is the currently agreed upon workflow, and how does it tackle the issues I mentioned?

Just to make it clear: I want to develop the native library and consume it in the cross-platform .NET (5/6) application simultaneously; it's not that I'm binding a mature library with periodic releases, which would make the mentioned workflow easy.

Thank you.

Ðаn
Rui Oliveira
  • I don't follow your concerns. "All" you need to do is to create a nuget package which contains particular binaries in particular locations. This is how managed dlls are packaged into nuget packages already: there's no significant difference here. Why does this suddenly mean that CI/CD is "completely broken"? – canton7 Jul 27 '21 at 14:13
  • Perhaps the phrase "the folder structure" is the hang-up? That normally means the folder structure of the nuget package: I don't think anyone's suggesting that you track the native binaries in version control for instance – canton7 Jul 27 '21 at 14:14
  • @canton7, well, thing is, if for each build I want a .dll, a .dylib, a .so, copy all those into some place, and package that, how do you do that with a CI system? You'd need to have the different builds copy artefacts to somewhere else, and package at the end? That's what I don't get. – Rui Oliveira Jul 27 '21 at 14:16
  • Sure, it's normal to have the CI system save an artifact from one build step, and make it available to later build steps. Multiple build steps would be somewhat pointless without this. So you can have build steps on various OSs to generate the dll, dylib and so, and a 'package' build step which pulls all of those together into the nuget package – canton7 Jul 27 '21 at 14:19
  • Hum, thanks for bringing that up to me. Maybe I was overstating in my head how hard or "correct" that was. I'll look into it, thank you. If you don't mind, I won't close for now, to be open for some extra input. – Rui Oliveira Jul 27 '21 at 14:21
  • E.g., [how to do it in GitLab](https://stackoverflow.com/questions/38140996/how-can-i-pass-artifacts-to-another-stage), [TeamCity](https://www.jetbrains.com/help/teamcity/artifact-dependencies.html#Configuring+Artifact+Dependencies+Using+Web+UI), and [GitHub Actions](https://docs.github.com/en/actions/guides/storing-workflow-data-as-artifacts) – canton7 Jul 27 '21 at 14:24
  • That said, there are some packages which produce a nuget package per platform, e.g. [OpenCvSharp4](https://github.com/shimat/opencvsharp) – canton7 Jul 27 '21 at 14:27
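To illustrate the artifact-passing approach described in the comments, here is a rough GitHub Actions sketch: a matrix job builds the native library per OS and uploads it, and a final job downloads everything before packing. Job names, paths, and the native build command are hypothetical and would need adapting to the actual project.

```yaml
jobs:
  build-native:
    strategy:
      matrix:
        os: [windows-latest, ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      # Hypothetical native build; substitute your CMake/MSBuild invocation
      - run: cmake -B build && cmake --build build --config Release
      - uses: actions/upload-artifact@v2
        with:
          name: native-${{ matrix.os }}
          path: build/out/

  pack:
    needs: build-native
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Downloads every native-* artifact into ./artifacts before packing
      - uses: actions/download-artifact@v2
        with:
          path: artifacts
      - run: dotnet pack src/MyNativeLib -o nupkgs
```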

0 Answers