Build2 – A C++ Build Toolchain and Package Manager (build2.org)
90 points by beagle3 on May 4, 2017 | 76 comments



Ugh. Hate to be a Debbie Downer. I mean, nice effort, but one of the crappy things about some of these newish languages is how each of them comes with its own package manager baggage that is different from the package manager that all the rest of your software is installed through.

So I have my Debian setup and I'm happily using apt to install all my software. Now ruby comes along and I need some libraries and have to go and install everything with gem. And I can't use apt to see what gems I have installed and vice versa! Argh!!! Now I have to write some python code and here comes pip... And node/npm... and on and on.

apt-get install libx11-dev is just fine for my C and C++ development needs, thankyouverymuch. I don't see the value in yet another incompatible package manager.


Once you've started using language package managers, it becomes very apparent that they have huge value that apt-get doesn't bring.

In particular, you get the exact version of the library you want. If you've ever had the experience of trying to make your software work with multiple versions of its dependencies because different OSes ship with different versions, then you'll know how annoying this is.

This is particularly true of run-time problems. If they don't compile, then that's relatively easy to fix. If there's a difference at run-time, it's often way more expensive to fix.

With a package manager, you get the exact version, but you also can use different versions for different apps on your system. You can also upgrade without breaking other apps (new security version) or not upgrade if there's some good reason not to.


> In particular, you get the exact version of the library you want. If you've ever had the experience of trying to make your software work with multiple versions of its dependencies because different OSes ship with different versions, then you'll know how annoying this is.

Isn't this just kicking the can down the road though? The incompatibilities are still something you either have to deal with or users end up with 58 outdated versions of each library and the security vulnerabilities they bring.

> Once you've started using language package managers, it becomes very apparent that they have huge value that apt-get doesn't bring.

This is what's wrong with so much modern development, we do what's easy for developers, not what's best for users.


> This is what's wrong with so much modern development, we do what's easy for developers, not what's best for users.

In the client software case, software that works is what's best for users. If you upgrade a library and apps stop working, that sucks.

If you're running server-side software, you want dev-production parity, or you can't know it works. So you _need_ to use a package manager. But the same applies to client software: you can't test all these permutations, so you can't even tell it works, and it probably doesn't.


I'm not arguing against using a package manager. I'm arguing against using 5 of them that don't know about each other.


I am interested in such a thing (e.g. Ruby's bundler knowing how to install native dependencies from any package manager), but that means pkg-config or a similar ubiquitous tool would have to be able to query all kinds of servers in order to see the versions of packages available.


FreeBSD has had a package called bsdpan for some time that will register items installed through CPAN with the package manager. You may want to poke at that for inspiration on how to do it for other languages. Offhand, for handling multiple systems, it may be most useful to check something like uname and parse accordingly to figure out what's what.


You don't think software that's secure is important in both cases? Open up a random project with a language-dependent package repository and tell me how many of the packages are up to date. More often than not, the answer is zero with commercial software.

If you're running server-side, you should have test environments that match prod to be able to catch incompatibilities early.


For what it's worth I completely agree with you. I've recently started learning Rust and the only thing that truly bothers me about it is that it appears as though you _must_ use Cargo to manage your dependencies.

I work for a company that supports a distribution (SUSE) and I also participate in a distribution's community (openSUSE), and this mentality of "let's just reinvent package management" really does bother me, because it feels like the benefits of having a single package manager for the entire system are being ignored. And this ease of pushing new packages results in a huge number of small packages, so that a standard "hello world" in many frameworks requires something like 700 packages (from a distribution side, that's 700 downstream packages that we now need to curate and maintain).

Don't get me wrong, language package managers do have benefits, but they make life absolute hell for distributions. And distributions having full control of package management definitely brings benefits to users. Rust is the most extreme example, where projects can require a particular version of the nightly build of the compiler. How the hell is a distribution meant to be able to build and ship Rust code in that instance (given that distributions also provide the development tools necessary for working on said projects)? Do we create a new package for every nightly compiler build? How do we curate it?

One of my colleagues gave quite a fair review of the situation and some potential future steps in his talk "Package Managers all the way down"[1]. But bringing this mentality verbatim to C++ is not a step in the right direction IMO.

[1]: https://www.youtube.com/watch?v=4ua5aeKKDzU


You don't have to, it's just the path of least resistance.

We've been working with distros on packaging stuff; it's mostly sorted. Each distro is doing the thing that works for them.

Additionally, making sure we play well with other build systems is a big goal this year generally.


> We've been working with distros on packaging stuff; it's mostly sorted. Each distro is doing the thing that works for them.

Have you worked with openSUSE? As far as I know, devel:languages:rust is still in the state of "we aren't sure how to deal with these problems". In particular, there aren't any Rust rpm macros in openSUSE yet, so there doesn't appear to be a clear way to create an rpm for a particular Rust project.

I've written several things in Rust that I would love to package for our distribution, but I just don't know how we are meant to do it without depending on a network connection to download dependencies out-of-band (outside of rpm's dependency system).

> Additionally, making sure we play well with other build systems is a big goal this year generally.

Yeah, that's the big problem at the moment. OBS provides huge benefits (rebuilds when dependencies change, tracking of security bugs and fixes) but it also has reasonable restrictions like not being able to connect to the internet.

Having a better story in this respect _really would_ help Rust in its adoption in distributions.


openSUSE is a bit behind, say, Debian, but I thought there were people actively working on it. Last I remember, Rust and Cargo were packaged, but they weren't sure if they wanted to follow in Debian's footsteps and automatically convert packages or do something more manual like Arch.

As long as you have the stuff previously downloaded somewhere, offline builds of everything are totally supported; they're required not only by distros but by Firefox, for example.


Also, for Cargo's use case (mostly-reproducible builds, a space that rubygems also attempts to fill), system package managers are simply insufficient. Nix is the only one that comes close, and you

A) can't remotely expect anyone to have that installed already, and B) have no control over the repository or toolchain


Let's say that you and I both create a library. You try to publish your C++ library to the standard Debian repository, and I'll try to publish mine to CPAN, npm, pip, etc...

If you actually manage to get your library included, then we can compare how long it takes to update your library.


Systems like Debian, though, have the ability to add custom repositories. So after the initial, possibly small package that just configures the sources.list, everything's handled with all the other updates. That's something that npm, etc, can't do.


Npm and many other package managers support custom repositories just fine


Sure. I publish to deb repos all the time.

(I've found Bintray to be the easiest.)


I didn't say "a" deb repo, I said "the standard debian repository". Just like when I'm talking about NPM, I'm talking about the main NPM repository, not my company's private NPM server.


Well, you've chosen a terrible comparison then.

How many NPM repos are there? Just one, too-big-to-fail, centralized repository.

Don't ding apt/deb just because people set up their own repos rather than put all the code in the world into one bucket.

There are a lot more inconvenient things than adding to /etc/apt/sources


> How many NPM repos are there? Just one, too-big-to-fail, centralized repository.

There are actually plenty, it's just that they're usually found on-prem at a company.


The reality is that these tools add value that extends beyond mere package management. You are right though, if you just use them to install things, and not as part of a development process, they might seem superfluous.


Well, I think exactly the opposite. In C++ it's a nightmare to build a prototype with a new library. You spend a huge amount of time setting up the build system for that particular library, need to figure out how to plug it into your code, and fix dependencies (which in practice always have something broken/missing). That being said, I think a proper package manager would add value to C++ development.


> apt-get install libx11-dev is just fine for my C and C++ development needs, thankyouverymuch

... on Linux.


Common problems with using distribution package managers for software development:

- not all versions are available, and the versions that are available differ across distributions

- even if the version is the same, the location is not consistent, necessitating search logic to find libraries to link to and headers to include

- even if the version is supposedly the same, the distribution may have enabled different options, fiddled with the package or otherwise made it unsuitable for your use. Upstream developers are often loath to support the version your distribution uses.

- multiple versions cannot be installed independently, and the version you need may conflict with the packages you are using on your system but are not dependencies of your project

- This likewise can cause conflicts between different projects you are working on, causing a need for VMs and projects like vagrant, which are very heavyweight and mean you need to maintain multiple copies of your preferred editor and IDE setup.

- Forking a dependency to fix or customize it (to submit a fix upstream or not) gets even more painful.

All of these are much less painful for language based package managers.


In theory there is no reason why build tools can't be integrated with package managers so you don't experience that kind of silliness.

For development purposes, you may want to have the system build everything from source (and/or fetching pre-compiled modules from a build system repository), and output a self-contained tree of local files that you can run.

For release purposes, you can in theory output a package manager package (RPM, .deb, etc.) that contains only your own code's binaries and dependency declarations that the package manager can go satisfy.
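
A rough sketch of the .deb case using CMake's CPack (hypothetical package name, maintainer and dependency list; nothing specific to build2):

    # CMakeLists.txt fragment: ship only our binary, declare deps for apt to satisfy.
    install(TARGETS my_app RUNTIME DESTINATION bin)

    set(CPACK_GENERATOR "DEB")
    set(CPACK_PACKAGE_NAME "my-app")
    set(CPACK_PACKAGE_VERSION "1.0.0")
    set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com")  # required by the DEB generator
    set(CPACK_DEBIAN_PACKAGE_DEPENDS "libx11-6, libstdc++6")
    include(CPack)
    # Running `cpack` in the build directory then produces a .deb whose runtime
    # dependencies are resolved by the system package manager, not bundled.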

There seems to be some degree of integration with build2 (see https://build2.org/faq.xhtml#why-syspkg ) though I don't know how much exactly or whether they go as far as what I'm describing.


> apt-get install libx11-dev is just fine for my C and C++

Language level dependency managers tend to cover a much wider selection of the extant third party library ecosystem than system level package managers.


Because Debian isn't the only game in town for OSes?

I for one don't want to bother with M×N packages.


Nice work. However, I think the C++ community is best served by standardizing on CMake. It is widely used by many projects, and many IDEs support it natively (for example, JetBrains CLion and Microsoft Visual Studio 2017).

In addition, "modern CMake" is much better than classic CMake" and the project is improving with such things like CMake server.


CMake is not a package manager or package repository, though. CMake can discover libraries and it can build your code, but that assumes you've already fetched a dependency somehow. Sure, you can use system package managers, but these are different on every platform.

I've been writing a simple video game in C++ recently, using CMake to build. I wrote a little bit of code that fetches the few libraries I need in order to have a complete/deterministic build process that works on linux/windows/mac. It lets me change versions (by specifying a new url to a source tarball). Versioned dependency management is so typical and convenient in many other languages and is such a pain in the butt with C++ because CMake is this thing that's sort of good enough. But my CMake experience doesn't compare well at all with maven, pip, gem, cargo, or even npm.


Conan (https://conan.io/) tries to tackle the package management for C/C++.


While Conan still has some room to grow, I love the approach they take to package management for C++.

Too many C++ tools couple package management and build, which increases their adoption cost and makes it more work to package third-party packages. Conan takes the approach of managing dependencies and then giving your build system of choice the information it needs to access those dependencies via Conan plugins. I think I had read that the author of Conan was involved in biicode and that the third-party packaging problem is what inspired him to create Conan.

Some other cool things about Conan:

- Can be used for more than C++

- SCM independent

- Artifact caching

- Options allow clients to customize your package (creates distinct artifact)

- Scopes allow developers to modify your build (like conditional build steps so doesn't create a distinct artifact)

Some things I think can be improved:

- Needs to separate out dependency lock file like Cargo or Poet

- Build tools need to be packaged in Conan for traceability / reproducibility

- Plugins are in Python which is heavy for deployment and has a large compatibility surface

- Easy to hit limits of declarative dependency files, being forced into implementing them in Python (see above)

- Project templates are hard coded.


> Too many C++ tools couple package management and build

Conan does this too. If I follow the instructions to integrate CMake with Conan, I would write something like this:

    include(${CMAKE_BINARY_DIR}/conanbuildinfo.cmake)
    conan_basic_setup(TARGETS)

    add_executable(timer timer.cpp)
    target_link_libraries(timer CONAN_PKG::Poco)
Which now couples my build with Conan. If the user would like to pull down the dependencies without Conan (i.e. using apt-get or manually installing them), the CMake build script will not work.

Package managers like cget[1] or conda[2] do not couple the build with the package manager. Cget is also serverless, which is nice as it can install dependencies from anywhere (even directly from GitHub with `cget install jgm/cmark`); however, you need to set up your own hosting if you want to install binaries.

[1]:http://cget.readthedocs.io/en/latest/

[2]:https://conda.io/docs/intro.html


> Which now couples my build with Conan. If the user would like to pull down the dependencies without Conan (i.e. using apt-get or manually installing them), the CMake build script will not work.

http://docs.conan.io/en/latest/integrations/cmake/find_packa...
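
For what it's worth, the decoupled style that page describes boils down to keeping the CMakeLists package-manager-agnostic, roughly like this (a sketch; the exact variable or target names depend on whatever find module or config Conan, apt, or a manual install provides):

    find_package(Poco REQUIRED)
    add_executable(timer timer.cpp)
    target_include_directories(timer PRIVATE ${Poco_INCLUDE_DIRS})
    target_link_libraries(timer ${Poco_LIBRARIES})
    # No include(conanbuildinfo.cmake) and no CONAN_PKG:: targets, so the same
    # build script works whether the dependency came from Conan or elsewhere.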


I dislike that it is written in Python instead of C++.

No need to force me to install Python just to run a build tool.


The nice thing about cget is that even though it uses Python (which is mainly for easy distribution), the cget protocol only has a dependency on CMake. So CMake modules like cmake-get[1] can install cget recipes directly from CMake without needing Python installed.

[1]:https://github.com/pfultz2/cmake-get


Python is a great solution to the problem, as well as being installed on most Linux distros by default. Chances are devs would need it for their other dev tools anyway, even when not on a Linux machine.


A build tool should be written in the same language it targets, with no need for third-party dependencies.

Many of us only install what we actually need, or what the IT department allows on our images.

I certainly don't want to open an IT ticket to get Python, just to be able to compile random package X.


I maintain and use http://teapot.nz for development. It's made my life so much better - it's designed for developers.


Nice, I will definitely try it out!


Thanks, I'd love any and all feedback. I use it almost daily and my goal is to make it as awesome as possible!


Yep, I'm a huge fan of this for this reason. I've spent a lot of time in my life messing around with cmake, scons, custom bash scripts... you name it, trying to get around C++'s shitty dependency management. And then recently I picked up Ruby, for the first time in a year or two, to write a little chat bot... 'gem install' is flawless across my Windows desktop and my Linux laptop, easy to depend on a specific version, easy to upgrade, etc. Tbh I'm pretty jealous.

e: does anyone know if this has any relation to Boost's b2? I know that was originally based off of jam which is not related but the name is so similar I can't help but think I'm missing something :(


'gem install' hasn't worked flawlessly in my experience (on both Windows and Linux). It's more likely than not that I'll have a gem fail to install on me due to some dependency or configuration issue. It's consistent and frustrating enough that I shy away from Ruby now.

I like Ruby more than Python as a language, but I will say that I've never had pip fail to install a dependency on me.


A lot of Ruby libraries just happen to use native extensions that rely on OS-packaged libraries being installed.

I'm surprised there isn't any part of the gem process that calls `pkg-config`, to verify if a gem will be able to install, given its native dependencies. Having `bundle install` bail out after just a second with "please install libfoo first" would be great, especially for CI.

> I've never had pip fail to install a dependency on me

`pip install` has never failed me, but `pip install --upgrade` fails for me quite a bit. Especially when I try to use it like `gem upgrade`, with a stanza like

    pip list --outdated --format=legacy | cut -d' ' -f1 | xargs pip install --upgrade


Maybe because `pkg-config` is mostly a GNU/Linux thing?


pkg-config works on windows with visual studio as well.


What about IBM i, z/OS, ClearPath, INTEGRITY OS, Solaris, HP-UX, Tru64, AIX, ..... ?


The problematic parts of gem/pip install are....the C parts.


And if they were installed from a gem, how long do you think it would take for security patches to be applied for every install? ImageMagick is a popular gem dependency; go look at the vulnerabilities:

https://www.cvedetails.com/vulnerability-list/vendor_id-1749...


Actually, cget can install packages directly from CMake packages (or other build systems). There is also the cmake-get module, which can install cget recipes directly from CMake without needing cget or Python installed.


I second what you are saying about modern CMake. When I first switched to CMake years ago I grew to regret it. It worked but it was a total mess. Then a couple years ago I saw someone mention "modern CMake" and I googled around and realized they had implemented a much more sane approach since I first learned it. There is more to it than just this, but a lot of the time you can just do a few find_package(library) commands, an add_executable(program [source files]), and finally a target_link_libraries(program [libraries]) to declare your dependencies, and everything just works. Include paths are correct. Private dependencies aren't cluttering up stuff downstream. Debug and release versions of libraries work as expected. Build flags and compiler definitions only apply to the modules you intended them to apply to.
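
For a concrete flavour, a minimal "modern CMake" build along those lines looks roughly like this (a sketch with hypothetical project and dependency names):

    cmake_minimum_required(VERSION 3.5)
    project(my_app CXX)

    # Imported targets carry include dirs, definitions and transitive deps,
    # so nothing else has to be spelled out by hand.
    find_package(Boost REQUIRED COMPONENTS filesystem)
    find_package(ZLIB REQUIRED)

    add_executable(my_app main.cpp)
    target_link_libraries(my_app Boost::filesystem ZLIB::ZLIB)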

This is why I've been liking the approach Microsoft has been taking with vcpkg. They clearly see which way the wind is blowing. It's not restricted to using CMake but there is a strong preference for it and it uses CMake as its build scripting system.

I just barely got done trying it on a large project for dependencies and it worked very well overall. A few commands to install a dozen or so dependencies and a quick change to add the vcpkg CMake toolchain so Visual Studio will pass it to CMake. After that, I can just use Open Folder on my CMake project in Visual Studio 2017 and everything just works. It's still a little rough around the edges but vcpkg seems to be developing at a breakneck speed for how new it is.

Dealing with dependencies is the worst part about being a C++ Developer and I'm glad to finally have something that makes it this easy (on one platform at least; I still get to rip my hair out on macOS).


I just wrote a bash script collection built on CMake that allows me to build and deploy libraries and modules for easy use in any project I want, without having to worry about all the include paths, linking, etc. I just add the name of the module to a CMake variable and the scripts do the rest, making sure everything is found and included. Even third parties work nicely this way, since I can just build them using my CMake scripts and have them automatically exported as modules for other projects to use.

Setting this up was a couple weeks of learning curve, but I agree it is worth it to invest in CMake and also learn the low-level details about how it all fits together, since after all we are dealing with a native low-level language. It also pays to see under the hood of these "package managers" and build systems so you can be fluid on any platform where you need to build.
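
The mechanism CMake itself provides for this kind of "export a module once, consume it anywhere" workflow looks roughly like the following (a sketch with hypothetical names, not the actual scripts):

    add_library(mylib src/mylib.cpp)
    target_include_directories(mylib PUBLIC
        $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
        $<INSTALL_INTERFACE:include>)

    install(TARGETS mylib EXPORT mylibTargets
            ARCHIVE DESTINATION lib LIBRARY DESTINATION lib)
    install(DIRECTORY include/ DESTINATION include)
    install(EXPORT mylibTargets NAMESPACE mylib:: DESTINATION lib/cmake/mylib)
    # Plus a small mylibConfig.cmake that include()s the exported targets file;
    # consumers then just find_package(mylib) and link against mylib::mylib.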


You can actually do this from within CMake itself with a bit of a hack

Here is someone who did something similar:

https://crascit.com/2015/07/25/cmake-gtest/

If you're interested I can upload my version a bit later


This sound similar to cmake-get:

https://github.com/pfultz2/cmake-get


Like the other comment says, you need to have the dependencies already prebuilt.

You can, say, do things like

"link to these headers and this library" - and the more modern syntax makes that cleaner

but you still can't do a simple

"Get this dependency, build it, then link my current project to it"

To me this is a very strange and frustrating limitation b/c all the necessary pieces are already there! It just doesn't have the functionality to call "build it" in the middle like that for some reason. I actually ended up hacking it in (it's like 50 lines of code) but it's a bit ugly and doesn't work recursively
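
The hack is essentially the trick from the crascit.com post above: run a throwaway ExternalProject at configure time just to download the source, then add_subdirectory it. A rough sketch (hypothetical names, not the exact 50 lines):

    # dep-download.cmake.in -- sub-project whose only job is downloading the source
    cmake_minimum_required(VERSION 3.2)
    project(dep-download NONE)
    include(ExternalProject)
    ExternalProject_Add(dep
      GIT_REPOSITORY    https://example.org/dep.git
      GIT_TAG           v1.2.3
      SOURCE_DIR        "${CMAKE_BINARY_DIR}/dep-src"
      CONFIGURE_COMMAND "" BUILD_COMMAND "" INSTALL_COMMAND "" TEST_COMMAND "")

    # main CMakeLists.txt -- run the sub-project now, then build the dep in-tree
    configure_file(dep-download.cmake.in ${CMAKE_BINARY_DIR}/dep-download/CMakeLists.txt)
    execute_process(COMMAND ${CMAKE_COMMAND} -G "${CMAKE_GENERATOR}" .
                    WORKING_DIRECTORY ${CMAKE_BINARY_DIR}/dep-download)
    execute_process(COMMAND ${CMAKE_COMMAND} --build .
                    WORKING_DIRECTORY ${CMAKE_BINARY_DIR}/dep-download)
    add_subdirectory(${CMAKE_BINARY_DIR}/dep-src ${CMAKE_BINARY_DIR}/dep-build)
    target_link_libraries(my_app dep)   # or whatever library target the dependency defines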

EDIT: It's also very inconvenient to do out-of-source builds with dependencies... constant source of frustration


I've started with and am used to the "old-school" CMake. Is there a good article that showcases some of the more modern features?


As a new cmake user these two presentations have been essential in getting started with "modern cmake":

[0] http://thetoeb.de/2016/08/30/modern-cmake-presentation/

[1] https://www.slideshare.net/DanielPfeifer1/cmake-48475415

And this blog post also has some good stuff:

[2] https://rix0r.nl/blog/2015/08/13/cmake-guide/


Well, I'm in the middle of writing a book titled "Effective CMake" [1] where I'm going to explain how to use CMake in a nice and modern way and some of the tricks which are not covered by the (very good) documentation.

[1] https://leanpub.com/effective-cmake


No. I tried to install cmake and it wanted to install over 1000 files on my computer. I'm not prepared to install that level of super-bloat just to build a program


I've been by this page a few times. It's interesting to see work in this area, but there are only 15 packages on cppget.org. Obvious ones (boost, protobuf, folly, etc.) are missing from the list.

Is this still in some sort of alpha state with respect to new package submissions? Or is nobody volunteering to maintain the missing packages?


Any compiler already supporting some experimental version of modules?

If library devs could start trying those out, it might give it some traction...


I feel like the name bpkg is taken. BSD uses it and a search shows there's a Bash package manager that calls itself bpkg also.

Maybe call it cpkg?


This is awesome, we need the maven of cpp


> maven of cpp

Oh dear god no. Can we please have a way more sane manager than a maven of cpp?


What are the problems you encountered with maven (I've only used it a few times)?


What we really need is the cargo of C++


It's staggering to me how much time and effort we have to put into our build system for our C++ project at work. Having come to the C++ world after learning Rust, I'm constantly pining for the dead-simple "just works" nature of Cargo.


I am still waiting for Cargo to be able to do what we do in our C++ builds, especially where build time is concerned.

Lack of support for binary libraries, especially across projects, is a big deal breaker.

Watching the same dependencies being compiled multiple times isn't fun.

Yes, downloading the libraries, placing them in some directory and configuring paths might take some time.

But afterwards, C++ builds are quite fast as they just link to the provided libraries, instead of building the world every single time.


That's fair, although "building the world every single time" seems like a bit of an exaggeration, since cargo only recompiles dependencies if they're updated (or you specifically clean your build). The project I work on has relatively few dependencies, so the benefits of not having to specifically define different targets and spend time and effort supporting different toolchain- and platform-specific minutiae feels like it would be a big win for us.


It is not an exaggeration, because Cargo doesn't support avoiding recompilation of the same dependencies across multiple projects.

So if project A and B both depend on the same version of lib X, without any change on build flags, lib X will be compiled twice.

In C++, using binary libraries, only linking will occur.

Even if compiling from source is required, build systems like Clearmake allow for sharing object files across projects.

I do expect that Cargo might eventually support such scenarios as well, otherwise it won't be appealing to some of us.


Right, but you only have to do that once per project, unless you make a habit of clearing your dependencies quite often. Personally, I'd rather spend two minutes compiling my dependencies once every month or so than have to maintain a complex build system for a project in order to support different toolchains and platforms. I guess if your project is relatively small or constrained to certain toolchains/platforms, it wouldn't be as much of an issue, but at least for the project I work on, maintaining the build system is definitely a non-trivial amount of effort.


You are forgetting crates don't exist in isolation, rather they have a dependency graph, many times with overlapping nodes.

Those two minutes are actually around ten minutes to compile something like rustfmt, just because binary dependencies aren't supported.

My average C++ projects, mixed with Java or .NET code, are the ones actually taking two minutes, on the same system.

Truth be told they would take much longer when compiling everything from scratch, but we never do, because we already have the binary libraries.

So it is a bit hard to sell Rust when the toolchain is slower than C++, though I am pretty confident it will improve in any case.


Coming from Java, I would absolutely love to get my dependencies via Maven. It would even allow for multi-architecture/platform binaries and source dependencies.

In fact, another company in our corporation has gone the other way, packaging Java dependencies via nuget, and it's also OK.

Why can't people just get over their useless aversion against proven, mature solutions just because it's Java or some nonsense?


Why would I use this instead of Nix?


Last time I looked at Nix it had a problem with dealing with cross-compilation. The problem is mainly that many packages still use autotools which does not have a descriptive toolchain. Ultimately, it could provide a descriptive toolchain and then map that toolchain to the different build systems, which is similar to what cget does, but perhaps a separate tool to do this could be helpful so this could be shared among different package managers.
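
For comparison, the kind of "descriptive toolchain" being talked about is what CMake captures in a toolchain file (a sketch with a hypothetical ARM triplet and sysroot); it's this sort of declaration that a shared tool could in principle map onto other build systems:

    # arm-toolchain.cmake
    set(CMAKE_SYSTEM_NAME Linux)
    set(CMAKE_SYSTEM_PROCESSOR arm)
    set(CMAKE_C_COMPILER   arm-linux-gnueabihf-gcc)
    set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)
    set(CMAKE_FIND_ROOT_PATH /opt/sysroots/armhf)
    # Search the sysroot for libraries/headers, but keep host programs usable.
    set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
    set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
    set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)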


Why not use GitHub as a package repository?


Do you mean git? Not sure what difference github makes over vanilla git in this case..

GoLang kind of does this (via "go get") and it sucks for multiple reasons:

- you can only distribute packages as source, meaning you need to compile them yourself (this has pros and cons, but it would be nice to be able to distribute binaries if you want)

- your dependencies reference a commit hash, and unless you micromanage your dependencies and constantly look at the github repository's tags, you have no semantic versioning, making it a lot more likely you'll encounter breakages

- there's no way for your dependencies to specify dependencies, unless they're using git submodules (very few projects do), and even with submodules, you can't share dependencies across packages

The list goes on..

Git is for source control. Package managers offer a lot of different (and awesome IMO) features. Lack of package management is one of my biggest gripes with GoLang honestly.



