4

We have a huge number of libraries in our application. The libraries are written either in C++ or C#. (Platform: .NET Framework, Windows, 64-bit.) Compiling everything from source takes a lot of time. We are thinking about switching to prebuilt binaries, but we would still like to keep the possibility of going back to source code. We use git as our version control system; builds can be done by Azure DevOps. We can set up any custom server as necessary.

What ready-made tools exist for package management, with the possibility to easily switch between source code and pre-built binaries? (If the tools are not compatible with both programming languages, it's OK to specify a toolset for only one language.) If no such tools exist, what would you recommend - what kind of packaging to use, and what kind of scripts to write on top of it?

Is it possible to identify API/ABI breaks?

TarmoPikaro
  • 4,723
  • 2
  • 50
  • 62
  • 2
    I really don't understand this question. Can you please try asking it in another way? Thanks. – Jonathan Apr 27 '20 at 21:50
  • 1
    First, C++ and C# are very different languages and have hugely different mechanisms for compilation and for packaging. They also have fundamental differences on what _"API/ABI"_ means. Second, can you try explaining what you are looking for more explicitly, perhaps showing what you have and what you want. You appear to be asking _"Want to know how source code/binary switch is performed"_. In both languages, _source code_ is _switched_ into _binary_ through compilation. I'm sure you are asking something different, but it's very hard to tell – Flydog57 Apr 27 '20 at 22:03
  • just guessing, are you trying to ask how to use source control? such as git, github, etc., so you'll have source control where other people can contribute and you control the source and compiled versions? – iSR5 Apr 27 '20 at 22:14
  • Binaries are dependent on the operating system and the platform (processor). A binary for an ARM11 processor won't run on a platform using an Intel x86 processor. Likewise, a binary compiled for a Linux system on an ARM11 platform will not run on a Windows 10 system on an ARM11 platform. The idea behind issuing source code is so that people can compile it for their platform; otherwise you need to issue libraries for every combination or permutation of operating system and processor. – Thomas Matthews Apr 27 '20 at 23:22
  • There may be a difference between a binary and an executable. In the embedded systems world, a *binary* is usually a snapshot of the processor code that is executed in a specific area of memory (no transformation). Whereas an executable (such as the ELF format) is relocatable, designed to be loaded anywhere in memory with a few translations or transformations. – Thomas Matthews Apr 27 '20 at 23:25
  • Rephrased question fully – TarmoPikaro Apr 28 '20 at 20:02
  • some topics you may want to look into (for C#; I don't know about C++): nuget, Microsoft's Managed Extensibility Framework (MEF), inversion of control (IoC). The question about "switching" between source code and binary is quite strange. – Welcor May 01 '20 at 00:00
  • what do you mean by "switch between source code and pre-built binaries"? – bolov May 01 '20 at 03:44
  • What build system are you using? I suspect this question would get better answers if it targeted a specific build system, like CMake. – NicholasM May 01 '20 at 05:08
  • Maybe the author is asking about build vs. rebuild? If you change code in a library, the build target (for instance in msbuild) will rebuild the library from sources; otherwise it will reuse the existing object files and binaries. If your build is configured to always clean and then build, you could consider avoiding the clean target. Using "Build" you achieve your goal. – Vlad May 01 '20 at 19:17
  • 2
    This question is asking for a recommendation for a tool, which is one of the valid reasons to close a question. – Wyck May 06 '20 at 19:39
  • I guess I don't want to exclude any alternatives, but my idea is that if a component is ABI compatible, then no rebuild is needed; but if it's only API compatible, not ABI compatible, then you would need a rebuild. Ideally I'm searching for a build framework, but I suspect there is no ready-made one; that's why a proposal for a set of tools is acceptable as well. – TarmoPikaro May 10 '20 at 06:26

7 Answers

3

What ready-made tools exist for package management, with the possibility to easily switch between source code and pre-built binaries?

In Visual Studio, you can add a reference to another project (source code) or another library (dll), but you cannot add a reference to a project from a different solution.

If you want to switch between source and dll, then you have to keep adding/removing the project from your solution... this is not a good software architecture... it suggests that these components are tightly coupled, and you require the source code of one component in order to test another. Software components should be independent of each other.


Compiling everything from source takes a lot of time. We are thinking about switching to prebuilt binaries, but we would still like to keep the possibility of going back to source code

It sounds like you have a large-scale application and you want to reference a portion of your own code as a pre-compiled component, to reduce the build time. In other words, you want to build your own in-house libraries. Now, there is nothing special about external libraries: you can easily create a NuGet package for any of your own class libraries.

As an example, you can have 2 Solutions:

Your Solution-1 contains your class library projects. You can create a nuget package for each project in this solution. (Note that a nuget package is a .nupkg file)

Your Solution-2 has your remaining projects and you can install the above packages here.

If you ever need to change your library, you can modify it in Solution-1 and create a new version... of course you would need to update the package in Solution-2.

Note: you don't need to have 2 separate solutions in order to create a NuGet package... you can build a NuGet package for any class library.

Publishing the package

If you want everyone to access your package then you would need to Publish it, otherwise you can put your package on a shared drive which is accessible within your organisation.
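As a sketch of the shared-drive approach (`MyLib` and the paths are made-up names; nuget.exe is assumed to be on the PATH):

```shell
# Pack the class library after a Release build; produces MyLib.1.0.0.nupkg
nuget pack MyLib.csproj -Properties Configuration=Release

# Add the package to a folder feed on a shared drive
nuget add MyLib.1.0.0.nupkg -Source \\fileserver\nuget-feed
```

Consumers then register `\\fileserver\nuget-feed` as a package source in Visual Studio or in nuget.config.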

Breaking Changes

Let's say you have made your own nuget package: my-inhouse-lib and you want to make a breaking change to it:

  1. Open the my-inhouse-lib source code, make the change, then build and create a new nuget package; this would be my-inhouse-lib-ver-2

  2. Your other projects have a dependency on this package; install my-inhouse-lib-ver-2 in those projects (this is your Dev environment)

  3. Your Dev environment will break, because of the new breaking change... fix the Dev environment

  4. Release your fixed code into Test/Prod environment...

There is nothing special about this... it's just like updating any other nuget package

Hooman Bahreini
  • 14,480
  • 11
  • 70
  • 137
  • Does nuget packaging work for C++ code as well? Can nuget package file generation be automated - for example, if we have source code in a separate git repository, can we use a sparse git checkout to get the nuget package specification from that repository? – TarmoPikaro Apr 30 '20 at 09:15
  • How can API breaks be handled with nuget packaging? E.g. you break the API (requires recompilation) or the ABI (can work as-is without recompilation) - how can you configure nuget packaging for such breaks? – TarmoPikaro Apr 30 '20 at 09:20
  • Creating a nuget package is independent of your programming language, so long as you can open your application in Visual Studio and Build it, you can create a nuget package (please check the instructions in the link) – Hooman Bahreini Apr 30 '20 at 09:23
  • I don't really follow your question... generating a nuget package has nothing to do with your source control and git... you need to build your project and use the `nuget pack` command from the Command Prompt to get a `.nupkg` file... then install it where you need the package... just like installing any other library. – Hooman Bahreini Apr 30 '20 at 09:29
  • It's also possible to use nuget packaging for C++, see this link: https://digitalhouseblog.wordpress.com/2019/08/22/how-to-make-a-nuget-package-for-c/ But indeed that does not take side on API / ABI selection. – TarmoPikaro Apr 30 '20 at 09:34
  • Sorry I don't follow... what is API Break? – Hooman Bahreini Apr 30 '20 at 09:37
  • API - application programming interface; breaking it means making the interface incompatible with the previous version. See the wiki pages: https://en.wikipedia.org/wiki/Binary-code_compatibility https://en.wikipedia.org/wiki/Application_binary_interface. API compatible means that the interface is the same, but you still need to recompile your client code. ABI compatible means that you can just replace the old DLL with the new DLL and it will work out of the box. A normal break means that you add any new parameter and code will not work after that (requires client changes) – TarmoPikaro Apr 30 '20 at 09:52
  • Please note that if you change the source code for your library, the other projects using this library will not break (unless you install the new version of nuget package)... which means you can modify the source code of your library without having to worry about breaking other program. – Hooman Bahreini Apr 30 '20 at 10:09
  • To avoid thinking about API/ABI in C++ it's worth designing the API in a COM-like style - using only interfaces (virtual functions) in the API, and passing only built-in types or pointers to interfaces as parameters. Or you can use C functions as the API, or opaque pointers. In short: your API <=> ABI. Otherwise, if you pass std::string into a dll method and change the runtime version only in the dll (or only in the exe), then you have problems. If you change the runtime everywhere then you should recompile even if the API is intact. – Vlad May 01 '20 at 20:05
  • In the managed world of .Net / C#, I think an API break is exactly equal to an ABI break? – Chris F Carroll May 11 '20 at 11:39
1

I read your question as saying that dependencies are in .Net or C++, and your final output builds are .Net. You are aware that NuGet is 'the' standard .Net tool for all things dependency related.

Now. It depends on what exactly you want with your thought of easily 'switching' between source and binary?

You don't distinguish the build server from the developer machine. You surely want both to build from binaries so as to be as fast as possible, but I can't make sense of wanting source code on the build server: source code is for humans, not machines.

Option 1. You mean 'get the build speed of binary but the debugging experience of source'

You can achieve this with NuGet. You are using Azure, so you can add tasks to your library builds to publish straight to a private Azure NuGet feed. Your starting point for that is the Azure Artifacts documentation, which talks you through:

  1. Set up the feed
  2. Publish to the feed
  3. Consume the feed
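In an Azure pipeline, step 2 typically comes down to a push task after the pack step; a sketch of the YAML (the feed name `MyTeamFeed` is made up):

```yaml
# azure-pipelines.yml fragment: push the built .nupkg files to a private feed
- task: NuGetCommand@2
  inputs:
    command: 'push'
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    nuGetFeedType: 'internal'
    publishVstsFeed: 'MyTeamFeed'
```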

NuGet can give you the full "source code when debugging" experience because a NuGet publish can include source symbols. This is covered on the same page.

On the developer machine, replace dependencies on projects in the solution with dependencies on the NuGet package. You then get the normal nuget experience of:

  • NuGet caches compiled .dlls from the feed on your machine
  • NuGet will tell you when updates are available
  • You choose when to get latest; OR you can enforce always-update-on-build; OR you can enforce always-update-on-checkout.

Forcing update-on-checkout will give you the source-code-style experience of always looking at the (nearly) latest code, but still with the fastest possible build from binary. You can force update-on-git-checkout by adding a git post-checkout hook to run nuget update, so that every git get-latest will simultaneously get-latest from NuGet. https://ddg.gg/example%20git%20oncheckout%20hook%20windows.
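The hook itself can be tiny; a sketch, assuming nuget is on the PATH and `MySolution.sln` is your solution file (a made-up name). Save it as `.git/hooks/post-checkout` and make it executable:

```shell
#!/bin/sh
# post-checkout: after every checkout, pull the latest package versions
# so the working copy tracks the NuGet feed as well as the source tree.
nuget update MySolution.sln
```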

( It may help to enforce a standard checkout path for libraries on developer machines, because .Net symbol files will include the path used for the last build. However, this takes some effort and Windows know-how to reproduce the build server's paths on dev machines. )

Option 2. You mean 'be able to easily switch between building from source and building from binary'

There is no out-of-the-box solution for this because it is an unusual requirement. The 'simple' approach is to have separate projects and solutions. Instead, ask again: what exactly are you trying to achieve?

If you just want to be sure of building from very latest sources, then Option 1 already resolves this -- give or take a few seconds -- for both build server and dev machines, because your Azure nuget feeds will always contain the very latest builds.

( In fact, if you use automated tests, Option 1 is better than building from source, because it uses the 'most recent source that builds and passes tests' rather than the 'most recent thing someone accidentally checked in'. )

If you want 'full source in my project but fast builds', this can happen automatically: msbuild, like cmake, will not rebuild a project that is already up to date. However, Visual Studio developers habitually reach for the keystroke for Rebuild (= Clean + Build) instead of Build. If you have a ton of dependencies, changing this one habit can save you all the waiting time. The catch is that if other developers are making lots of changes, there is no avoiding the get-latest-source-and-rebuild time.
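On the command line the same distinction is just the msbuild target; an incremental Build skips up-to-date projects, a Rebuild compiles everything again (`MySolution.sln` is a made-up name):

```shell
msbuild MySolution.sln /t:Build /m      # incremental: only out-of-date projects compile
msbuild MySolution.sln /t:Rebuild /m    # Clean + Build: everything compiles again
```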

You could have different project files for building dev machine vs build server. Maintaining that will be an error-prone drag, so you should script the auto-generation of the build server csproj files from the standard ones. The build server can then get dependencies from nuget, whilst the dev machine uses source and also pushes updates of successfully built libraries to the nuget feed.
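A lighter-weight variant of the two-project-file approach is a conditional reference inside a single csproj; this is only a sketch, and `UseSource`, `MyLib` and the paths are made-up names:

```xml
<!-- Switch between source and binary with an MSBuild property -->
<ItemGroup Condition="'$(UseSource)' == 'true'">
  <ProjectReference Include="..\MyLib\MyLib.csproj" />
</ItemGroup>
<ItemGroup Condition="'$(UseSource)' != 'true'">
  <PackageReference Include="MyLib" Version="1.0.0" />
</ItemGroup>
```

Building with `msbuild /p:UseSource=true` then pulls in the project instead of the package.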

Comment : How close can we get to best of both worlds?

There is no avoiding the fact that having large projects with many dependencies introduces a cost that you don't have with small projects. You pay the price either in having large projects which are slow to open and slow to build, or in breaking up the builds (via NuGet or some other means) and losing instant, simple access to all the source code.

NuGet feeds with symbols offers the most-mainstream, and simplest, attempt to solve this.

Chris F Carroll
  • 11,146
  • 3
  • 53
  • 61
0

We are thinking about switching to prebuilt binaries, but we would still like to keep the possibility of going back to source code

For C++ you can use Conan (I'm sure there are other similar tools as well but I have not used them). You can upload your pre-built packages to a server, and have your developers install them when needed. It would be as easy as adding dependencies to the requirements list of a conanfile.py that you keep as part of your projects. And to build from sources (e.g. when binaries are missing from your server) you can add a --build option to the conan install command (which installs your project dependencies).
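As a sketch (`mylib` and the version are made-up names), the requirement goes into your conanfile.txt once, and the switch is just a command-line flag:

```ini
# conanfile.txt fragment
[requires]
mylib/1.2.0@mycompany/stable
```

```shell
conan install .                  # use pre-built binaries from the remote
conan install . --build=mylib    # force building this one dependency from source
conan install . --build=missing  # build from source only when no prebuilt binary exists
```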

to easily switch between source code and pre-built binaries

Are you trying to switch between keeping a dependency in your project's source tree as opposed to linking with the binary? I don't see any long-term benefit to having it in your source tree if you use a package manager. Sure, it might be ever so slightly more convenient to modify the source code of the dependency if it's part of the source tree (by not having to switch between directories, for example), but I would argue it is probably more organized and efficient in the long run to change the dependency separately in its own project. (And in Conan you can use "users" and "channels" to manage these custom versions of third-party libraries.)

Is it possible to identify API/ABI breaks?

The best way to manage this, in my opinion, is to keep track of your dependency versions. Again, a package manager like Conan can help you with this, since you specify the version of the dependency in your requirements.

A bonus of using Conan is that you can also add build dependencies that bring in your environment. For example, CMake can be a build dependency: your package would first install the specified CMake version and then the rest of your dependencies, built with that CMake. This makes your DevOps management much easier.

pooya13
  • 2,060
  • 2
  • 23
  • 29
0

For C++ projects, you can check CMake for your purpose. You can keep different build options, for example: 1) build the entire source code, or 2) build the submodules into libraries once and reuse them for the main project. The latest CMake also supports precompiled headers; you can check that out as well, to reduce compilation times.

Conan and vcpkg are two package managers available for C++ modules.

sanoj subran
  • 401
  • 4
  • 11
  • I have tried to install conan on Windows, without any luck. Have you tried conan on Windows yourself? – TarmoPikaro May 04 '20 at 08:32
  • One thing to note is that conan does not yet support publishing debug symbols to the Microsoft symbol server - https://github.com/conan-io/conan/issues/4047 . I did not request support for that, but this might be a decision maker as well. – TarmoPikaro May 05 '20 at 10:43
  • @TarmoPikaro I haven't explored Conan much. – sanoj subran May 05 '20 at 16:04
0

We are thinking about switching to prebuilt binaries, but we would still like to keep the possibility of going back to source code

So first of all you'll need a build system; I recommend cmake to handle your source code and its dependencies. Let's get a testing setup here, and let's assume your working directory looks like this:

project_folder
|-- lib
|   |-- LibA.cpp
|   |-- LibA.hpp
|-- tools
|   |-- conanfile.py
|   |-- conanfile.txt
|-- CMakeLists.txt
|-- main.cpp  

What they are supposed to be: CMakeLists.txt is your cmake configuration file, which handles the build and addresses your question

Is it possible to identify API/ABI breaks?

conanfile.py is your recipe, i.e. what you need in order to package and create your pre-built libraries, and conanfile.txt is your requirements file for consuming those pre-built libraries.

So, as you can see, I also suggest using conan to handle your pre-built libraries, also called packages.

Now let's go through my example and how to use it. CMakeLists.txt:

cmake_minimum_required(VERSION 3.15)

project(my_project LANGUAGES CXX)

add_library(LibA lib/LibA.cpp)

add_executable(ExecutableA main.cpp)
target_link_libraries(ExecutableA LibA)

target_include_directories(LibA PUBLIC ${CMAKE_CURRENT_LIST_DIR}/lib)

I'll stick with C++ in my example, but for your .NET Framework projects you could check this or this.

So I just define my targets in that file and declare their dependencies on each other, with the following contents. main.cpp:

#include <iostream>
#include "LibA.hpp"


int main()
{
  printTesting();
  std::cout << "Hello World! again" << std::endl;
  return 0;
}

LibA.cpp:

#include <iostream>
#include "LibA.hpp"

void printTesting(void)
{
  std::cout << "Testing Something" << std::endl;
}

LibA.hpp:

#pragma once

void printTesting(void);

I'll just assume you are not familiar with cmake: we'll let cmake generate the Makefile for us (or possibly the .vcxproj, if you use a Visual Studio generator for your .NET Framework), but I want to keep it simple here, so we'll configure the project and build the binaries.

mkdir ../build && cd ../build
cmake ../project_folder && make

You'll get your LibA and ExecutableA, so nothing special here yet, just that cmake now takes care of your dependencies and rebuilds them if something has changed (e.g. new commits in your projects). With a conanfile.py like this:

from conans import ConanFile, tools


class LibAConan(ConanFile):
    name = "libA"
    version = "0.1"
    settings = "os", "compiler", "build_type", "arch"
    description = "<Description of LibA here>"
    url = "None"
    license = "None"
    author = "None"
    topics = None

    def package(self):
        self.copy("*.a")

    def package_info(self):
        self.cpp_info.libs = tools.collect_libs(self)

you want to:

cp ../project_folder/tools/conanfile.py . && conan export-pkg . libA/0.1@myuser/testing

Our package name is libA, in version 0.1. myuser/testing is the user/channel, which is interesting in case you're doing cross-compilation for several target platforms.

We created the reusable package by calling conan export-pkg, and it is cached in $HOME_DIR/.conan/data/libA/0.1/myuser/testing (or check it in your environment: conan config home)

Possibly you want to install conan first, so check this installation guideline or these downloads, and for your particular setup: conan with azure devops

Those created packages can also be uploaded to any remote server: $ conan upload libA/0.1@myuser/testing --all -r=my_remote_server (as an analogy: like git push after committing in git).

So we have configured and built with cmake, and created and uploaded packages using conan. Your colleagues can reuse the prebuilt packages/binaries by defining their requirements in conanfile.txt:

[requires]
libA/0.1@myuser/testing

[generators]
cmake

and they could do:

mkdir ../build_using_prebuild && cd ../build_using_prebuild && cp ../project_folder/tools/conanfile.txt . && conan install -g deploy . -r my_remote_server

conan will search for that required package and possibly download it from my_remote_server

If they now run cmake ../project_folder && make, those prebuilt packages will be used, instead of compiling the src from scratch.

Of course I recommend that you automate those steps in Azure, but I guess you get the idea of how a build system and package manager can be used to orchestrate your build.

dboy
  • 1,004
  • 2
  • 16
  • 24
0

There are multiple axes to this problem: source code building, installation of packages, external repositories, API/ABI breaks, and the build system itself.

First of all, clarify your requirements: are you interested only in packaging, or also in installation?

For packaging itself it's possible to use, for example, the following packaging systems: nuget, conan, vcpkg, choco.

Most probably you will be interested in publishing not only the packages themselves, but also the debug symbols for them.

For Azure DevOps / debug symbol publishing, documentation can be found at the following links, for example:

https://azure.microsoft.com/en-us/blog/deep-dive-into-azure-artifacts/
https://learn.microsoft.com/en-us/nuget/create-packages/symbol-packages-snupkg

Conan (C++ packaging) does not currently support symbol publishing: https://github.com/conan-io/conan/issues/4047

To get source code distribution it's possible to use git itself. Git submodules allow you to reference an external git repository, but once you have added an external submodule, you need a double commit for every change: one commit to the sub-repository, and one commit to update the referenced git hash in the master repository.

There are some workarounds to this problem, such as nested git repositories without submodules:

Nested git repositories without submodules? => http://debuggable.com/posts/git-fake-submodules:4b563ee4-f3cc-4061-967e-0e48cbdd56cb

If, however, you build the main repository with prebuilt binaries instead of source code, then ideally you don't need that external git repository at all, meaning no git clone/checkout of the external repository is needed.

Then it's also possible to have either an all-in-one solution or multiple solutions, each compiled separately.

An all-in-one solution is easier, because there is less hassle with compiling dependent solutions.

To achieve fully dynamic solution generation, it's possible to use cmake, for example - it can generate a solution depending on preconfigured options (see cmake's "option" command).
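A minimal sketch of such an option (`USE_PREBUILT_LIBS`, `MyLib` and the paths are made-up names):

```cmake
# CMakeLists.txt fragment: choose source vs. prebuilt at configure time
option(USE_PREBUILT_LIBS "Consume prebuilt binaries instead of building from source" ON)

if(USE_PREBUILT_LIBS)
  find_package(MyLib REQUIRED)      # locate the prebuilt package
else()
  add_subdirectory(external/mylib)  # build the same library from source
                                    # (assumes it defines an ALIAS target MyLib::MyLib)
endif()

target_link_libraries(my_app PRIVATE MyLib::MyLib)
```

Switching is then `cmake -DUSE_PREBUILT_LIBS=OFF ..` instead of editing project files.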

cmake is a good alternative for C++ (it supports precompiled headers and unity builds for speed-ups), but maybe not so straightforward for C# (it's difficult to find the parameters needed to configure it correctly).

At the moment (May 2020) I haven't found any better alternative to cmake, but there are multiple tools and ecosystems evolving in parallel to cmake, so we need to observe what the future might bring.

Then, concerning packaging: choco builds on nuget - it extends nuspec with its own xml tags supporting software updates. (Referenced web pages: https://chocolatey.org/, https://learn.microsoft.com/en-us/nuget/reference/nuspec)

If you want to publish a nuget package or even an installer package, it's possible to use a nuget repository (without any extra costs).

If that is insufficient for your needs, it's also possible to upgrade the nuget server to a choco server, but that might cost something - see the chocolatey web pages.

API/ABI breaks cannot be observed at any level until an actual compilation/link error occurs or the application crashes at run time. The only alternative is to control the versioning of the packages themselves - so the master nuget (or choco) package would require a higher version of the dependent nuget (or choco) package.
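That version coupling can be expressed directly in the master package's nuspec; a sketch with made-up package ids:

```xml
<!-- MasterApp.nuspec fragment: pin the child package to a compatible range -->
<dependencies>
  <!-- accept any 2.x of the child package, but not an ABI-breaking 3.0 -->
  <dependency id="ChildLib" version="[2.0,3.0)" />
</dependencies>
```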

If the main repository is developed by the same people as the child repository, then API/ABI breaks can be handled by the developers themselves; but if the two gits are independent of each other and developed by different teams, then API/ABI breaks can occur at any point in time.

The best way to ensure that the repos stay consistent is to have unit testing in the master repository which checks that no API/ABI break occurs.

Theoretically, if an API/ABI break occurs, the build system itself could continue with two build alternatives:

  1. follow the main repository with the last working child repository
  2. follow the child repository without building the main repository

The same mechanics could be applied using branches, e.g.

  1. "main git: master branch" + "child git: release/1.0 branch"
  2. "child git: master branch"

This kind of mechanism might require observing not only the two repositories, but also switching branches in case of long-lasting failures (e.g. the development team of the child git does not care about build breaks occurring in the main git).

I suspect there are no ready-made tools for this purpose (please leave a comment if one appears), as this might require connecting tools from two potentially different build chains, from potentially different git repositories, from potentially different organizations.

And last but not least - if you want your application to be capable of software updates, then nuget/choco distribution download could be used with an additional nuget/choco server.

Here is a good example of the download part for a nuget package: https://github.com/mwrock/NugetDownloadFeed

Building an installer package is a bit more complex issue - see for example the following links:

(In order best to worst, in my opinion)

TarmoPikaro
  • 4,723
  • 2
  • 50
  • 62
0

You can use Reference Paths in Visual Studio. I use that in an ASP.NET Web API project and it works for me, but I have a problem in ASP.NET MVC (not Web API).


When the project is loaded, Visual Studio ignores the Reference Path; when you unload the project, Visual Studio uses the precompiled reference from the Reference Path, and so you can switch between them.

Alireza Ahmadi
  • 8,579
  • 5
  • 15
  • 42