
Let me introduce you to the beautiful world of virtual environments. They save you the headache of getting a full installation to run, especially when using Windows.

I prefer miniconda, but venv also does the job.
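
For anyone who hasn't set one up before, the stdlib flow is only a few commands (`.venv` as the directory name is just a common convention):

  python -m venv .venv
  source .venv/bin/activate    # on Windows: .venv\Scripts\activate
  pip install -r requirements.txt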


And this works about 25% of the time. The rest of the time, there is some inscrutable error with the version number of a dependency in requirements.txt or something similar, which you end up Googling, only to find an open issue on a different project's GitHub repo.

Someone needs to make an LLM agent that just handles Python dependency hell.


"Someone needs to make an LLM agent that just handles Python dependency hell."

This is why they are constantly delaying GPT 5.


Haha fair enough—if they can solve the Python situation, I'd be fine slapping the "AGI" label on it.


As someone who doesn't develop in python but occasionally tries to run python projects, it's pretty annoying to have to look up how to use venv every time.

I finally added two scripts to my path for `python` and `pip` that automatically create and activate a virtual env at `./.venv` if there isn't one active already. It would be nice if something like that was just built into pip so there could be a single command to run like Ruby has now with Bundler.
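
Something along these lines, as a simplified sketch (the interpreter path and the `.venv` location are just what I picked):

  #!/bin/sh
  # "python" wrapper on PATH: reuse an active venv, otherwise create/use ./.venv
  if [ -z "$VIRTUAL_ENV" ]; then
    [ -d .venv ] || /usr/bin/python3 -m venv .venv
    exec ./.venv/bin/python "$@"
  fi
  exec "$VIRTUAL_ENV/bin/python" "$@"

The `pip` wrapper is the same idea, delegating to ./.venv/bin/pip instead.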


`uv` is great for this because it's super fast, works well as a globally installed tool (similar to conda), and can also download and manage multiple versions of Python for you, including which version is used by which virtual environment.
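
For example (from memory, so double-check the exact flags; version numbers are just illustrative):

  uv python install 3.11 3.12         # download managed interpreters
  uv venv --python 3.12               # create .venv against one of them
  uv pip install -r requirements.txt  # fast, drop-in pip replacement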


While my uv use is still early days, I would second this recommendation. I've found it to have the functionality I miss from conda when using venv, but faster and more reliable than conda.


I am also using conda, specifically mamba, which has a really quick dependency solver.

However, sometimes repos require system-level packages as well. I tried to run TRELLIS recently and gave up after 2h of tinkering to get it to work on Windows.

Also, whenever I try to run some new repo locally, creating a new virtual environment takes a ton of disk space due to CUDA and PyTorch libraries. It adds up quickly to 100s of gigs since most projects use different versions of these libraries.

</rant> Sorry for the rant, can't help myself when it's Python package management...


Same experience. They should really store these blobs centrally under a hash and link to them from the venvs.
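
As far as I understand, uv already works roughly this way: one copy of each package in a global cache, hardlinked into every venv that needs it (flag name from memory, so treat it as an assumption):

  uv cache dir                                # where the shared cache lives
  uv pip install numpy --link-mode hardlink   # link files from the cache into .venv instead of copying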


Virtual environments with venv don't answer the python version problem unless you throw another tool into the mix.


uv solves this problem.

  uv venv --python 3.11
done.


conda and uv do manage Python versions for you, which is part of their appeal, especially on systems that don't make it straightforward to install multiple pre-compiled runtimes side by side because the official OS channel only offers one Python version. At least on macOS, brew supports a number of recent versions that can be installed simultaneously.
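
Concretely, something like (versions just for illustration):

  brew install python@3.11 python@3.12   # both can be installed side by side
  python3.11 -m venv .venv               # then point venv at the one you want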


Hmm? My venvs do include the Python version (via symlink to /bin). Don't yours?


If you use something like uv (expanded here: https://news.ycombinator.com/item?id=43904078), I think it does. But if you just do `python -m venv .venv`, you get the specific version you used to create the virtual environment. Some OSes distribute binaries like `python3.8`, `python3.9` and so on, so you could do `python3.8 -m venv .venv` to lock one env to a specific version, but it's a bit of a hassle.


The GP's problem was (apparently) an inability to install the right python version, not an inability to select it.


Conda does! `conda create -n myenv python=3.9`, for example


While true, the benchmarks are not run on the Ryzen's NPU but on the much stronger GPU.


It's because of the bigger VRAM: 70B parameters don't fit into the 4090's 24 GB.
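
Back-of-the-envelope, ignoring KV cache and runtime overhead (my own rough numbers):

  python3 -c "print(70e9 * 2 / 2**30)"    # ~130 GiB of weights at fp16
  python3 -c "print(70e9 * 0.5 / 2**30)"  # ~33 GiB even at 4-bit, still over the 4090's 24 GiB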



Not sure how widely known this is, but recent studies have shown great, sustained results for type 2 diabetes through dietary interventions using wholegrain oat (as it contains beta-glucan): https://www.thieme-connect.com/products/ejournals/html/10.10... https://www.sciencedirect.com/science/article/pii/S221479931...


Type 2 has had a high correlation with obesity and high carb diets.


But interestingly also a very high genetic factor with 90% of identical twins both having T2DM (which is greater than that of type 1 which if I remember correctly is 40%)


> very high genetic factor with 90% of identical twins both having T2DM

Or both not having it, I hope?


Sounds like Nature vs nurture to me. Until there is a proposed genetic marker... it's just another item confusing the public about correlation vs causation.


Looked at the first paper. I have significant enough concerns that, frankly, I didn't finish reading it.

1. Small sample size, <20 IIRC.

2. No control group at all. (There should have been a group under the same requirements and the same diet.)

3. They picked 'uncontrolled' patients, and from my own experience that term is synonymous with "unmanaged," which translates to "patient is not compliant with treatment." As such, feeding them exclusively a vague "diabetic diet" coupled with the 5-day hospital stay is enough to cloud the results so that no conclusions can be made.

4. Continuing from 3: people rarely intentionally make themselves feel like crap, which you will with uncontrolled type II. The hospital stay, the exposure to allegedly* diabetic-friendly foods, and the subsequent time for the subjects to realize "I feel better, I like this!" basically invalidate the entire paper.

* allegedly, because I just got out of a hospital with a fantastic cafeteria. But the "diabetic menu" had way too many items with high glycemic indexes, and nothing to maintain a steady sugar level until the next meal.

Finally: "HbA1c was lower four weeks after the oatmeal intervention."

Two days of fasting won't change an A1c value.


There are several more studies and dietary recommendations regarding oats; just search Google Scholar and similar.


I'm skeptical of any claim that says consuming carbs is helpful when it comes to type 2 diabetes.


Can you recommend a resource on this for the curious learner?


That pretty much sums up their early adopter experience.

The PineTime ecosystem is pretty neat nowadays, just try out InfiniTime: https://github.com/InfiniTimeOrg/InfiniTime


The ASRock DeskMini X600 was presented one week ago. ASRock blamed AMD for the lack of affordable mainboards that delayed it. https://www.asrock.com/news/index.asp?iD=5353


Oh, thank you! Looks like they did keep the DeskMini very much like the A300/X300, including keeping SATA.

Some other stories like https://videocardz.com/newz/asrock-announces-deskmeet-deskmi... suggest the DeskMini and DeskMeet can take a non-G 7000 CPU up to 65W; the 12-core 7900 would fit, though there are some substantial tradeoffs to doing that.


The soil in the Austrian state of Styria has notoriously low iodine levels. When visiting Styria in 1748, David Hume wrote:

'But as much as the country is agreeable in its wildness, as much are the inhabitants savage, and deformed, and monstrous in their appearance. Very many of them have ugly swelled throats; idiots and deaf people swarm in every village; and the general aspect of the people is the most shocking I ever saw.' [1]

That's why some traditional costumes in that region include a so-called "Kropfband" (goitre band). [2]

It's fascinating how much one's place of birth used to influence one's life and medical history, and in many countries still does.

[1] https://www.gutenberg.org/files/42843/42843-h/42843-h.htm

[2] https://de.wikipedia.org/wiki/Kropfband


You'll love what the community did with Supreme Commander, Chris Taylor's spiritual successor to Total Annihilation: https://www.faforever.com/

They even created a co-op campaign and a completely new faction.


I put a lot of hours into SupCom. It was always a buggy mess, but I do remember it quite fondly. Planetary Annihilation has felt like the right progression in the spiritual series. It was always a very smooth experience and addressed the technical issues of scaling up to massive-scale combat while fixing the balance issues of turtling. The skill ceiling is immense and I could never beat high-level bots. The nature of having anywhere from one to many planets also means you can play cooperatively with up to several friends. It's a good time.


Supreme Commander's getting its own spiritual successor:

https://www.sanctuaryshatteredsun.com/


I thought Planetary Annihilation was a spiritual successor to both TA and SC


That's technically true. Planetary Annihilation was a heroic effort led by Jonathan Mavor (formerly of Gas Powered Games) that fused elements of TA, SC, and its own crazy ideas on what amounted to a shoestring budget.

Supreme Commander's budget was about $50M USD (circa the mid-2000s), and Mavor was lead engineer on the project.

Beyond All Reason is arguably more similar to Total Annihilation than Planetary Annihilation.

By that same token, Sanctuary is the closest thing to a complete remake of Supreme Commander. Granted, Forged Alliance Forever has had such incredible work put in by the community that it comes close, but it's ultimately stuck on the same engine.

Turns out Mavor has a new company now, and I'm elated to hear someone's finally combining Factorio with a decent RTS component:

https://industrialannihilation.com/


I am incredibly excited for Industrial Annihilation; it really looks close to my ultimate dream RTS.

> IA is a unique blend of genres: deep factory building combined with real-time strategy action.

