Talking about Python being "somewhere in the middle": I had a demo of a simple webview GTK app I wanted to run on a vanilla Debian setup last night, so I did the canonical thing of the month and used uv to instantiate a venv and pull the dependencies. Then I attempted to run the code... mayhem. Errors indicating that the right things were in place but that the code still couldn't run (?), and finally Python core dumped. OK. This is (in some shape or form) what happens every single time I give Python a fresh go for an idea. Golang may be more verbose (and I don't particularly like the go.mod system either), but once things compile, they run. They don't merely attempt to run or require some OS-specific hack.
GTK makes that simple Python program considerably more complex, since it'll need more than pure-Python dependencies.
It's really a huge pain point in Python. Pure-Python dependencies are amazingly easy to use, but a lot of packages depend either on C extensions that need to be built or on OS-level dependencies. It's gotten better with wheels and manylinux builds, but you can still shoot your foot off pretty easily.
Python is near the top among languages that have given me trouble in other people's production software. Everything can be working fine, and then one day the planets fall out of alignment or something, the Python portion of the software breaks, and the fix is as clear as mud.
I'm pretty sure the gtk dependencies weren't built by Astral, which, yes, unfortunately means that it won't always just work, as they streamline their Python builds in... unusual ways. A few months ago I had a similar issue running a Tkinter project with uv, then all was well when I used conda instead.
Yeah, this is exactly the overall reality of the ecosystem, isn't it? That being said, I do hope uv succeeds in its unification effort; there's nothing worse than relying on a smattering of different package managers and build streams to get basic stuff working. It's like a messy workshop: it works, but there's an implicit cost in terms of the lack of clarity and focus for the user. It's a cost I'm not willingly paying.
It may not be the grand unifier if they aren't willing to compromise. Currently I'd say conda is the "grand unifier", giving users 100% what they ask for artifacts-wise, albeit rather slowly. On the other hand, uv provides things super fast, but those things may break 5% of the time in unusual ways on unusual configs. I have no issue using both for the fullest experience.
I've had similar issues with Anaconda, once upon a time. I've hit a critical roadblock that ruined my day with every single Python dependency/environment tool except basic venv + requirements.txt, I think. That approach gets in the way the least, but it's also not very helpful: you're stuck with a requirements.txt, which tends to be error-prone to manage.
Yup, but 5 to 15% faster year over year is real progress, and that's ultimately what Python's big user base is counting on at this point... and they seem to be getting it! Full disclaimer: I'm not a heavy Python user, exactly because of the performance and build/distribution situation. It's just sad from an end-user perspective (I'm not addressing centralised web deployment here but rather decentralised distribution, which I ultimately find more "real" and rewarding).
Anything out there for reference, or would you be implementing from theory/ideas here? Godspeed with the project overall; it's exciting to see the beginnings of a Rust-like language without the headaches!
As with smart TVs and their advertisements, locked-down cars will provide post-sale revenue for the manufacturer. Thus they will mass-produce and sell those cars at a cheaper price, while adding a significant markup to the non-locked-down versions, primarily sold to commercial partners in smaller batches. The cheaper mass-produced one will significantly outcompete the others, to the point where most dealers will no longer sell the more expensive versions. Most customers won't even know the difference when they buy the car, and why would the seller go out of their way to inform the buyer when most people just care about the sticker price anyway?
Locked-down products also allow the manufacturer and dealer to form an agreement to share some of the post-sale revenue. The dealer can run their own repair shops, which the customer is now forced to use, naturally at a steep markup, and the manufacturer gets a slice of every repair. The cheaper sticker price can then be lowered further, since the dealer now has additional revenue after the sale. The customer can go to another manufacturer-approved shop, but how many of those exist in the same city? And the manufacturer can artificially limit how many shops get approved in a given location.
The story is the same across any number of industries right now. Customer choice is not an argument that will stop the manufacturer from doing it. If you as a customer want to opt out, after a while the only choice will be to not buy a car, or to buy the expensive version at double the cost.
If you crash into a BMW, you'll still have to pay the now-inflated cost of the repair, most likely through mandatory liability insurance. Insurance premiums have gone through the roof, in large part not just due to insurance-company greed but because cars, and repairs to them, have become more expensive.
Sensible take, thank you. When HN gets these "our project: from language X to language Y" frontpage stories, I always think it would be far more exciting to see "our project: 38.2% smaller code base by optimizing our dependency use", "our project: performance improved 16.4% through basic profiler use", or similar!
Is the trade-off here more secure code in exchange for added complexity/difficulty? This is a real question: has the Tor code itself been exploited by bad actors before? All the incidents I've seen in the news were some other software running over Tor being exploited to phone home or give up user data.
It seems they worry about it, which I can understand. But now with Rust I worry about new logic bugs, supply-chain issues, and a lack of proper security updates.
Or you could look at other projects that have been using Rust for many years, and consider these factors there too. Those folks have generally concluded the opposite.
The distribution I use already has limited security updates for Rust: https://www.debian.org/releases/trixie/release-notes/issues.... which reduces my security. The cargo supply-chain issues are also very obvious; I am far more worried about those than I ever will be about memory safety, but hopefully Tor reduces its reliance on random dependencies.
I find that surprising given that Debian breaks Rust programs up into individual apt packages, but ultimately, other distros do not have this issue. It’s also about userspace programs and not the kernel, which does not use external packages and so sidesteps this completely.
Debian forky has Rust in the kernel on by default.
Right, from my understanding, Debian was packaging Rust programs in the same way as C ones. So they’d update the individual library and it should be all good. They deduplicated all of the dependencies in their trees.
This seems reasonable to me. If you have a TARmageddon-style event, you update one library instead of thousands of packages. Although I'm not sure how well this can work in Rust, given monomorphization.
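A tiny Rust sketch of why monomorphization complicates the "patch one library" model (the generic `max_of` here is a made-up stand-in for any library-provided generic):

```rust
// Sketch: monomorphized generic code from a library crate is compiled
// into each downstream binary, not shipped as a shared object.
// `max_of` stands in for a hypothetical library generic.
fn max_of<T: PartialOrd>(a: T, b: T) -> T {
    if a > b { a } else { b }
}

fn main() {
    // Each concrete type produces its own compiled copy
    // (max_of::<i32>, max_of::<f64>) inside *this* binary, so a
    // distro patching the library package does not fix the copies
    // already baked into dependents; those must be rebuilt.
    println!("{}", max_of(3, 7));
    println!("{}", max_of(2.5, 1.0));
}
```

This is the core tension: Debian can deduplicate crate sources into individual packages, but a fix to a generic function still forces a rebuild of every binary that instantiated it.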
Do you think the sibling comment was flagged to "protect me" whatever that means, or is it because "Are you high on Prozac?" is not really a productive comment?
EDIT: And now that I've scrolled down, I see you've left this comment many times as random replies. I'm sure those will get flagged, but for spam reasons, not due to some grand conspiracy.
Isn't this just the same value judgment mistake? You're just presupposing that things like "smaller code base" are better in virtue of themselves the same way that "rewritten in Rust" might be as well.
The parent poster's point is seemingly to reject "this is simply the better thing" (ie: "small code is better") and instead to focus on "for what we are doing it is the better thing". Why would "basic" profiler use be better than "niche" or "advanced" profiler use if for that context basic would actually have been inferior (for whatever value of basic we choose to go with)?
It seems to me that the reality we're often confronted with is that "better" is contextual, and I would say that "basic" or "smaller" are contextual too.
I think whether your Rust application is going to be more performant or efficient than C depends on whether you are focused on writing performant and efficient code. Out of the box, I'm guessing people will pull in too many cargo packages, each over-engineered or written by less-experienced developers, so the result will be less efficient and less performant.
In addition, you could more easily inadvertently introduce security problems.
Is Rust the right choice for Tor? Sure. Is this move the right choice for security? If they moved to Rust to make the code easier to manage and to find help from younger, less-experienced developers, then they traded away some security in the process, so no.
Given how heavily most C programs lean on type erasure vs. monomorphization and how often they reimplement basic data structures, it's kind of a miracle they hold up against Rust/C++.
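For concreteness, Rust itself exposes both dispatch strategies, so the contrast can be sketched in one file (function names here are illustrative): `dyn Trait` erases the concrete type the way a C `void *` interface does, while generics monomorphize.

```rust
use std::fmt::Display;

// Type-erased dispatch, roughly what C achieves with void* plus
// function pointers: one compiled body, calls resolved at runtime
// through a vtable.
fn describe_dyn(x: &dyn Display) -> String {
    format!("value: {x}")
}

// Monomorphized dispatch: the compiler emits a separate, fully typed
// copy per concrete T, which it can inline and optimize aggressively.
fn describe_mono<T: Display>(x: &T) -> String {
    format!("value: {x}")
}

fn main() {
    assert_eq!(describe_dyn(&42), "value: 42");
    assert_eq!(describe_mono(&42), "value: 42");
}
```

The observable behavior is identical; the difference is in the generated code, which is where the C-vs-Rust/C++ performance comparison in the comment above plays out.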
> I think the chance that your Rust application is going to be more performant or efficient than C, is whether you are focused on writing performant and efficient code.
I believe that depends on the sophistication of algorithms. High-level algorithms (especially if they involve concurrency or parallelism) are much easier to write in Rust (or in C++) than in C, which gives them a pretty good chance to be at least as fast as any reasonably safe C implementation.
For low-level algorithms, of course, it's really hard to beat polished C code.
> Out-of-the-box, I’m guessing people will use too many cargo packages, each that are over-engineered or written by less-experienced developers, so it will be less efficient and less performant.
I don't think that this is going to be a problem. The Tor Project developers I've interacted with sounded quite serious about security. Forbidding non-blessed cargo packages is pretty trivial.
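One way such a blessed-packages policy could be enforced (an assumption on my part, not something the Tor developers have said they use) is cargo-deny, whose configuration can ban crates by name:

```toml
# deny.toml, read by `cargo deny check bans` -- a minimal sketch.
[bans]
# Flag duplicate versions of the same crate in the dependency tree.
multiple-versions = "warn"
# Ban specific, unvetted crates outright (hypothetical example name).
deny = [
    { name = "some-unvetted-crate" },
]
```

Run in CI, this turns "don't add random dependencies" from a review convention into a hard build failure.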
> In addition, you could more easily inadvertently introduce security problems.
Use the Brave browser and look at the built-in filtering (search for "Content Filters" in settings); it allows explicit removal of Shorts via enabling the "YouTube Anti-Shorts" filter list. Does the job beautifully.
Basic-to-great rationality or skill may not be what is being rewarded here (although the baseline of course needs to be met) - it could well be compliance capability. Hence the string of arbitrary memorization exercises.