This might be a little side-topic but IMO Alpine is famous for having a small byte size footprint.
I love Rust to tears but it produces fairly huge binaries even in release mode and even with some symbols stripping.
So if Alpine also plans to include various Rust tools I'd hope that this will prompt the core team to also work on reducing the sizes of the final Rust binaries.
I don't want to sound entitled! I am very thankful to the entire Rust team. And Rust is hugely important nowadays. It's just that when it comes to distributing stuff in Dockerfiles (and fly.io, and several others) the Rust binaries stick out like a sore thumb. :\ They're fairly big.
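For what it's worth, a fair amount of size can already be clawed back with release-profile settings alone. A sketch of a size-focused Cargo profile (these are all standard Cargo options; `strip = true` needs a reasonably recent toolchain):

```toml
# Cargo.toml — size-oriented release profile
[profile.release]
opt-level = "z"     # optimize for size rather than speed
lto = true          # whole-program link-time optimization
codegen-units = 1   # better optimization at the cost of compile time
panic = "abort"     # drop the unwinding machinery
strip = true        # strip symbols at link time
```

None of this gets you to C-sized binaries, but it routinely cuts a few MB off a default release build.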
This is definitely an area it would be great to see more work in. On Fuchsia we started linking the rust stdlib dynamically, but there are some significant ABI constraints to doing this in a traditional Unix shared library ecosystem that we're able to avoid due to the isolated and storage-deduplicated nature of the package system.
If you do link Rust's stdlib dynamically, you end up with binaries comparable to C++ binaries that use an external libc++: Rust's stdlib is approximately the same size as libc++, and the release-mode stripped binaries are also comparable (within ~1 KB of each other, and in the range of 10 KB for a standard Unix-pipeline-style application that does some basic stream processing of IO).
Is dynamically linking Rust's standard library officially supported, or did you have to hack around that? I know static linking is the default, but I've never looked into doing otherwise.
The flag to dynamically link dependencies exists, and is part of the compiler's stable flags, but I'm not sure how much it's actually used, or how well it actually works.
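For reference, the flag in question is `-C prefer-dynamic`, which is stable. A minimal sketch of trying it out (assuming a `hello.rs` on a machine with the Rust toolchain installed):

```shell
# Link against libstd as a shared object instead of statically
rustc -C prefer-dynamic hello.rs

# The binary now depends on libstd-<hash>.so from the toolchain's lib dir,
# so it only runs where a matching toolchain build is on the library path
ldd ./hello
```

The catch is the last comment: since the stdlib ABI isn't stable, the shared libstd must come from the exact same compiler build.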
Correct, the ABI is not stable. Given that they build a base system with a single rustc, that's not an immediate concern, though of course, a more stable ABI helps dynamic linking be more relevant.
It works well in terms of programs working. It has similar challenges to C++ that various commonly used abstractions end up producing a lot of code on the program side regardless. It's nonetheless worthwhile, if you have enough programs and resource constraints.
I am definitely not as well-versed as you -- but for now I'd stick with Rust's statically linking nature. What I am more looking for is to have some very aggressive tree-shaking at the final phases of Rust compilation/linking where the tooling can prove that in your program, say, 73% of the stdlib isn't used at all so it can just remove it from the final binary.
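In the meantime you can at least measure where the bytes go. A sketch using the third-party `cargo-bloat` tool (not part of the core toolchain):

```shell
cargo install cargo-bloat

# Per-crate breakdown of the release binary's text section
cargo bloat --release --crates

# Largest individual functions
cargo bloat --release -n 20
```

In my experience the stdlib is often a smaller fraction than people expect; generics-heavy dependencies tend to dominate.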
It's my view that as deployment moves more and more to containers and even to the "microkernel + your app only" model then being able to fit binaries in CPU caches will become crucial for the viability of such cloud hosting. (Although to be fair, when we're talking about Rust + the Linux kernel, running something in 25ms vs 5ms can't be a huge deal for 98% of all apps, right?)
Not sure why I am eating the down arrows but I'd like to specify: I just tested on a few projects of mine and the binary size reduction is definitely more aggressive and better-looking compared to a year ago. Very happy with the gains! I am leaving my original comment above anyway since there's IMO still some room for improvement.
For comparison, without LTO and stripping a fairly small hobby project of mine easily compiled to ~3MB; with them it's now ~800KB, roughly a 73% reduction (almost 4× smaller). Nice.
I’m not saying they aren’t small or that they haven’t done a fantastic job, but my impression is that this is mostly due to the default install being minimal;
The reason I say this (and I could be wrong) is because the use of musl for the libc means each binary kinda needs to include its own libc statically.
This must make the binaries huge!
Also, you can easily make small rust binaries, it’s Golang (of the two) that produces very large binaries.
> the use of musl for the libc means each binary kinda needs to include it’s own libc statically.
This is factually incorrect. Musl supports dynamic linking and if you run `ldd` on any binary in Alpine you can see that it dynamically links against musl libc.
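Easy to verify from any machine with Docker, e.g. against the stock busybox binary (the image tag here is just an example):

```shell
docker run --rm alpine:3.19 ldd /bin/busybox
# lists /lib/ld-musl-x86_64.so.1 — i.e. dynamically linked against musl
```

Static musl linking is a choice people make when building portable binaries *for* Alpine-like targets, not something Alpine's own packages do.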
Trust in open-source's ability to populate all niches. If Rust is added, and it significantly bloats up the Alpine image, someone will come along and make Alpine-Classic or Montane Linux or something.
Maybe I'm misunderstanding, but from what I understood this isn't about adding Rust to the base Alpine install. It's about moving it so that it's installed from the main repository. Currently it's available in the community repository.
Docker acts as late linker for dynamically linked apps, bundling shared libs and executable together. Statically linked Rust apps certainly go against the grain in that model but can often be made significantly smaller with some build and dependency tweaking.
I don't know what you like about the phrasing, but it's not really friendly. The word "oxidize" used like that is clearly Rust community slang, so it signals that the author identifies with the (a?) Rust community, but it also has the effect of alienating those who aren't members of the subculture.
Apart from that, I find it strange that people are so brazen with the whole "rewrite stuff in Rust" agenda.
I'm not a member of the Rust community (yet), but it was no great leap to understand the oxidation reference. It may well be community slang, but the link between oxidation and rust in the English language is not hard to grasp.
As for the agenda of re-write everything in Rust, I cannot comment; as a software engineer I love re-writing things in my new favourite language of the day, so my judgement is clouded.
I read that comment not as "re-write everything in Rust", but more "rewrite a subset of components in Rust where it might help with security and resiliency".
Rust can certainly help with the latter, but very few projects will ever have the funding or time to achieve the former.
> Apart from that, I find it strange that people are so brazen with the whole "rewrite stuff in Rust" agenda.
They used to, but these days they've calmed down a lot, and many actively fight against the RIIR (Rewrite It In Rust) meme. I think part of it is that lots of people were exposed to low-level programming through Rust.

I know that when I first started working with legacy/old code, I had the urge to rewrite everything. It took me some time to gain respect and trust for the code that was already running. On the other hand, sometimes my insights as a newcomer were valuable. It's a balance you have to strike between the new and the old, as with all things: young people have the energy and new ideas, old people have the experience and stability. Both are needed. They'll sometimes clash, but with time everyone gains respect for the other party.

I feel like that's what happened with the Rust community. These days it's less "rewrite everything in Rust!" and more "Rust is an option here. Does it make sense? What are the alternatives?". The language is not even 10 years old at this point, and the same goes for the community. They've already matured a lot.
This is how weathering steel works, but for all other cases the instructions are to remove the oxidized layer and apply a protective coat of paint, because rust is typically porous and provides no protection, but traps moisture.
Also, stainless steel has an oxidized protective layer, but it's transparent, so we don't call it rust.
It probably means different things to different people. Usually when I see the term "oxidize" it tends to mean rewrite critical parts of software, but not necessarily the entire thing (like swapping out the TLS library in a C implementation for one in Rust, but the rest of the C program remains the same).
This would be different from rewriting the entire program in Rust.
I’m no Latin expert, but I think iron, or ferrous metals, are the only ones that “rust”. Raw aluminum starts oxidizing the moment air hits it; anodizing is deliberately oxidizing aluminum and then dyeing it; and this is even more true for metals like potassium that dull while you are looking at them. But I’ve never heard anyone say these metals “rust”; they oxidize.
Rust was not rolled back from cryptography. Newer versions of wheel and pip now use the precompiled wheel package for cryptography, so installing that package does not require the Rust compiler in Docker. However, this does not guarantee that installing other Python packages that use Rust will also go smoothly.
What it does still do is make pyca/cryptography incompatible with niche architectures that were previously supported but are unsupported by Rust (due to lack of support by LLVM).
It previously worked, but no one was promising support for niche architectures. It just happened to work (at least enough that there weren't obvious bugs, who knows if fully). Expecting support for every non-standard way you can use the software is usually not a reasonable expectation of maintainers.
A functional c compiler on a platform is one of the oldest, basest assumptions that package maintainers make. This is a good thing for open source in general and not something that should get tossed aside lightly.
A functional C compiler on the platform means that your software is in the realm of possible and that somebody interested enough might help maintain it.
The alternative presented is "lol screw you". Not by these thoughtful posts, but if you look at the sentiment of the threads like when this happened in pyca/cryptography.
We did not roll back our dependency, no, although our current release (3.4.x) allows you to disable Rust compilation via an environment variable specifically so we could understand where in the ecosystem challenges would occur. Our next release (now 35.0 based partially on feedback from the community around our unusual versioning) will hard depend on it for all X509 ASN.1 parsing. During the months since we did our first rust release, however, the `musllinux` specification and implementation has been finished so we expect to be able to ship binary wheels for Alpine very shortly. I am actually working on that today, with the only remaining blocker being an update to warehouse to allow wheel upload.
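(For anyone landing here from a broken build: on the 3.4.x series the opt-out mentioned above is the `CRYPTOGRAPHY_DONT_BUILD_RUST` environment variable.)

```shell
# 3.4.x only — later releases hard-depend on Rust
CRYPTOGRAPHY_DONT_BUILD_RUST=1 pip install "cryptography<3.5"
```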
I think I ran into this problem while trying to Dockerize ansible to be able to run on ppc64le.
For other reasons, I need to be able to run an up-to-date Ansible on Linux on POWER. I think due to the lack of an available wheel for cryptography on ppc64le, I ended up having to include cargo in the dependency chain and consequently got a pretty large Docker image (over 200MB). Ugh.
You can avoid this by building cryptography as a docker stage and then just extract the wheel as part of your docker build process. Alternately if you install rust, build cryptography, and uninstall rust all in one layer then you can also avoid this issue.
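A sketch of the multi-stage variant (image tags and apk package names here are illustrative; adjust to your base image):

```dockerfile
# Stage 1: build the wheel with the Rust toolchain available
FROM python:3.10-alpine AS build
RUN apk add --no-cache cargo gcc musl-dev libffi-dev openssl-dev
RUN pip wheel --wheel-dir /wheels cryptography

# Stage 2: the runtime image never sees cargo/gcc
FROM python:3.10-alpine
COPY --from=build /wheels /wheels
RUN pip install --no-index --find-links=/wheels cryptography
```

The final image only contains the installed wheel, not the toolchain that built it.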
That's what I currently do already but I did notice that I failed to uninstall gcc when looking at the Dockerfile again today so I'm now down to 150MB (after rebuilding). Still not great but better.
Currently my build uses Ubuntu, I'm going to try switching over to Alpine when I get a chance to see how much savings I can claw back.
I believe you should be able to avoid docker image size bloat by combining `install rust; do things that need rust; uninstall rust` into one run directive (same with anything that needs any compiler). It's not a runtime dependency.
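On Alpine, apk's virtual packages make the single-layer version tidy (package list is illustrative):

```dockerfile
RUN apk add --no-cache --virtual .build-deps cargo gcc musl-dev libffi-dev openssl-dev \
 && pip install cryptography \
 && apk del .build-deps
```

Because install, build, and uninstall happen in one RUN directive, the compiler never lands in a committed layer.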
The reason is that Python is mostly built and tested on Linux systems with glibc, but Alpine ships musl instead. It's a common source of compatibility issues, not limited to Python.
Note that pip wheels have historically been specified and built only for glibc compatibility, so the limitation extends to all Python packages.
I don't understand why Rust integration doesn't happen on a fork, especially for important packages like this. I'm sure people would elect to use py-cryptography-rust, and it would avoid an open goal for people looking to seize on a Rust-related headline in bad faith.
> It is assumed that by following the steps in this proposal that Alpine users will be presented with a functional Rust toolchain which has a maintenance window of 2 years, that is also supportable by the Rust community through its normal support channels.
I’m struggling to see the benefit of moving from community to main. It seems more symbolic and geared towards making certain guarantees for end users, but it’s hard to grok that
Alpine's biggest issue isn't that it doesn't have Rust in main; it's documentation. `apk` is a great package manager, but a lot of it is undocumented and you need to read the source.
Python supports Alpine just fine; indeed, the standard Docker Python images include Alpine variants.
If you mean binary wheels, PyPA packaging added support for Alpine wheels a few months ago (https://github.com/pypa/packaging/pull/411), and auditwheel support for the same just shipped today.