garganzol's comments | Hacker News

I know that some companies try to avoid LINQ as a rule. However, avoiding LINQ gives negligible gains most of the time.

Of course, if it's really a hot path like matrix multiplication then it makes total sense, but elsewhere avoiding LINQ comes with an unpleasant side effect: a loss of code soundness and quality.
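For a concrete feel of the trade-off, here is a minimal C# sketch (the method and array names are hypothetical, not taken from any codebase discussed here): the LINQ version is the one you would want almost everywhere, and the manual loop is what you would fall back to only on a profiled hot path.

    using System.Linq;

    static class Totals
    {
        // Readable and sound; fine for the vast majority of call sites.
        public static double TotalWithLinq(double[] prices) =>
            prices.Where(p => p > 0).Sum();

        // Manual loop: avoids enumerator and delegate overhead, but the
        // extra noise is only worth it when profiling shows a hot path.
        public static double TotalWithLoop(double[] prices)
        {
            double total = 0;
            for (int i = 0; i < prices.Length; i++)
            {
                if (prices[i] > 0)
                    total += prices[i];
            }
            return total;
        }
    }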


I can understand avoiding it; it can get messy when used with large amounts of data fetched from an SQL database, if a lazy dev uses LINQ instead of implementing proper queries in SQL or using stored functions.


Interesting approach, but it should be the .NET runtime (or the JIT) that optimizes small memory allocations, preferring the stack over the heap when possible.

.NET 10 takes a step in that direction [1].

[1] https://learn.microsoft.com/en-us/dotnet/core/whats-new/dotn...
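To illustrate the kind of case escape analysis targets, here is an assumed example (whether the current JIT stack-allocates this exact shape depends on the heuristics described in [1]):

    class Pair
    {
        public int A;
        public int B;
    }

    static class Demo
    {
        static int SumPair(int a, int b)
        {
            // The Pair instance never escapes this method, so a JIT with
            // object escape analysis may place it on the stack instead of
            // the heap, removing the allocation and the GC pressure.
            var p = new Pair { A = a, B = b };
            return p.A + p.B;
        }
    }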


I want to share one more related observation: classically, topology deals with geometric objects and continuous transformations. But there is another, more computer-flavored notion of topology that describes relations between abstract objects.

For example, let's take a look at the graph data structure. A graph has a set of stored objects (vertices) and a set of stored relations between the vertices (edges). In this way, a graph defines a topology in discrete form.

Let's take a look at the network data structure, which is closely related to the graph. It is very much the same idea, but with a value stored on every edge. A network has a set of objects (vertices) and a set of relations between the objects (edges), where each edge also holds a value. So it is also a form of topology, because the network defines relations between abstract objects.

In this light, you can view a graph as a neural network with {0, 1} weights. An edge is either present or absent, hence the {0, 1} values only. The network structure, however, can hold any assigned value on every edge, so every connection between objects (neurons) is characterized not only by its presence but also by its assigned value (weight). Now we get the full model of a neural network. And yes, it is built on topology in its discrete form.
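A minimal C# sketch of that distinction (class and member names are mine, purely for illustration): a graph only records whether a relation exists, while a network attaches a value to each relation.

    using System.Collections.Generic;

    // Graph: an edge is either present or absent -- effectively a {0, 1} weight.
    class Graph
    {
        private readonly HashSet<(int From, int To)> edges = new();
        public void AddEdge(int from, int to) => edges.Add((from, to));
        public bool HasEdge(int from, int to) => edges.Contains((from, to));
    }

    // Network: the same relation, but every edge also carries a value (weight).
    class Network
    {
        private readonly Dictionary<(int From, int To), double> weights = new();
        public void AddEdge(int from, int to, double weight) => weights[(from, to)] = weight;
        public double Weight(int from, int to) =>
            weights.TryGetValue((from, to), out var w) ? w : 0.0;
    }

In this view, a graph is just the special case where Weight is restricted to {0, 1}.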


Problem solvers based on graphs are hard to get your head around at first, but then you get extremely elegant and powerful solutions to seemingly unsolvable problems.

The only gotchas are: 1) the time it takes to get your head around them, and 2) the algorithmic complexity of the resulting solution.

Graph theory is probably the most fulfilling application of math in computer science. In a way, graph-based algorithms do magic similar to AI, but in a fully deterministic manner. If you think about it more broadly, a graph resembles a subset of a neural network, just with {0, 1} weights only.


Maybe some day neural networks will be so obvious and well-known to the general public that this is how we'll explain graphs to kids: imagine an NN where the weights are always 0/1...


The general public believes 1/3 is smaller than 1/4.


Only 1/4 [0] of the general public believes that, but marketers get hurt by any loss of customers ...

[0] https://mises.org/mises-wire/87-statistics-are-made


Neural networks are not so complicated. They are much, much simpler than you'd expect for something as complex as intelligence. It even makes me sad that things as simple as neural networks perform such complex intellectual feats...


Now we need to look up that "They're Made out of Meat" story. Here we go: https://www.mit.edu/people/dpolicar/writing/prose/text/think...


Building blocks (real ones, not metaphorical ones) are also simple.

Your brain is made of relatively simple cells. Even earthworms have neurons.

But the emergent complexity of systems made of simple neurons is staggering! That's the point, I guess. Simple bricks make complex systems.


You know what happened millions of years ago when monkeys started to use tools.

It's time for AI to learn how to use tools, like a formal mathematical solver. Well, it has already been done, but not with an LLM... soooo... academics only?


If it is an open source project, then why not? It gives the project some visibility and a welcoming openness, where everyone can contribute.


I cannot imagine moving to Git from Mercurial. Git looks clunky from my perspective. Yes, it works too, but working with Git is usability torture; sorry, but it is true. I do like some Git features better, though, but not most of them.


My honest opinion is that I hate that git won. It's too complicated for no benefit, with complexity I personally will never leverage as a scientist who doesn't work in large teams.

I use it for visibility and ease, that's all. Otherwise I personally dislike it.


I'm a pretty young developer and git is the only VCS I'm familiar with, and even though it has its quirks I find it quite powerful and a perfectly adequate tool for the job. In what way is Mercurial better?


IMO Mercurial is (was?) more user-friendly.

Here's a quick example: when I create a Mercurial repository, Mercurial doesn't say anything, while Git yells at me that it's using "master" as its branch name but that I can change it with a cryptic command. After a first commit for a file, Mercurial once again doesn't say anything, while Git gives me three lines of information including the permissions for the file I just added. Editing and committing a file in Mercurial with "hg commit" yields (again) nothing, while typing "git commit" in Git lets me know that it knows there's a modification but it won't go through until I "stage my change for commit".

Now, imagine you're a new user. Mercurial just did what I asked, and it even guessed that "hg commit" should mean "commit everything that's been modified". Git, on the other hand, has yelled at me about default branch names (what's a branch?!), file permissions, and bickered about me not staging my commit (what's a stage?!!). They both did the same thing but, for a new user, Mercurial did it in a friendlier way.


Heh, I've never noticed git commit including new file permissions on commit; definitely confusing/useless. Don't think "it prints less information" in general is a particularly good argument for user-friendliness though; if anything, it's the exact opposite.

Trying out hg for the first time - "hg init; echo hello>world; hg commit" prints a "nothing changed" and I have no clue how to get it to commit my file! Whereas git says 'use "git add <file>..."', and, as that's already required for starting tracking a file in both hg and git, it's not entirely unreasonable that you'll need to do "add" upon modifications too.

So in hg you have to explicitly think about file tracking and get changes for free, whereas in git you have to explicitly think about changes and get tracking for free. Obviously I'm biased, but I think "I need to tell git what changes I want committed" is a nicer model than "I need to tell hg when it should realize a file has started existing"; the former is pretty uniformly annoying, whereas I imagine the latter quite often results in adding a file, forgetting to "hg add" it, and making a bunch of commits with changes in other files as the new file is integrated, but never actually committing the new file itself, with zero warnings.

Git's staging/index, messy as it is (and with some utterly horrible naming), is extremely powerful, and I wouldn't accept any VCS without a sane simple equivalent. Extremely do not like that "hg commit -i", adding some parts manually, and deciding that I actually need to do something else before committing, loses all the interactive deciding I've done (maybe there's a way around that, but --help and "man hg" have zero useful info on interactive mode, not even what all the different (single-char..) actions are; granted, I don't really understand "git add -i" much either, and just use a GUI when necessary). In my git workflow I basically always have some changes that I won't want to commit in the next commit.


I think you are seeing it as a software developer as opposed to (say) a biologist on the first year of their PhD who just wants to keep their scripts safe. Mercurial's strong point (IMO) was to cater to the 90% of developers who work with two-to-three colleagues on a single branch - you could always make things more complex if needed (as evidenced by Firefox doing just fine), but the defaults were always more user-friendly than git's.

For a more time-appropriate critique, this post [1] from 2012 gives an overview of what working with Git felt like at the time when git was being popularized as an alternative to Subversion (including a frequent comment of "use Mercurial instead!"). It's also worth noting that git's error messages have become more helpful since - while the documentation for git-rebase used to be "Forward-port local commits to the updated upstream head", it now reads "Reapply commits on top of another base tip".

[1] https://stevebennett.me/2012/02/24/10-things-i-hate-about-gi...


Software developers will be the vast majority of users though, at the very least for the CLI.

Git certainly isn't anywhere close to the prettiest thing for ease-of-learning (and indeed used to be even worse), but Mercurial didn't seem particularly good either. Really for the common uses the difference is just needing to do a "git add ." before every commit, vs a "hg add ." before some.

All of my git usage has been on projects with ≤2 devs (including me; technically excluding a few largely-one-off OSS contributions of course), but I still use a good amount of local temp branches / stashes / rebasing to organize things quite often (but also have some projects where all I've ever done is "git add .; git commit -m whatever").


I love this tool, but I am pessimistic because the following is probably going to happen next:

1. At some point, the app will be covered with tons of ads to the point of being impossible to use

2. The core functionality will be hidden behind sign-up/sign-in walls. Email addresses will be collected and then spammed to the brim

3. To add insult to injury, the app's features will be gradually crippled unless you switch to a "Pro" plan. But you will not be able to do that efficiently as a user because you will be constantly attacked, bombarded and poisoned by ad banners and popups everywhere

4. Then, the app will start to upsell other offerings (of course, with modal banners!)

Those are cynical observations, but they are 99% accurate. I wish you good luck if you are among the 1% elite.


This is the author's 3rd or 4th attempt to launch a product in this niche: Interactivedemo.ai (ProductHunt launch), demo.fun (link in their profile), a Chrome extension with the same name (never took off), etc. I got all of this from their Twitter profile.

I'm a bootstrapped software developer myself, so I can't decide if I should admire this or flag this as spam. But you're probably right, we're getting a "pricing" page very soon.


I do want to charge, but if you have valid use cases and are willing to act as a beta tester, I would work out free service, technically forever. I am open to discussing this. I'm still experimenting and nothing is set in stone, but I actually quit my job a while ago, and this is what I want to do full time. I have a bit of runway.


I used LiveStation from time to time, and just for fun I played around with trying to find out when and how it employed P2P protocols. Needless to say, I never found any evidence of P2P in LiveStation. And now I know why :)

Thank you for bringing up the warm memories I thought I no longer had.


Thanks for supporting a business that was pretty cool, once. I bailed as it got into the consumer livestream space, but sometimes think about resurrecting it as a higher-quality OTT app that isn't rammed with absolute junk and ads. The work that platform did during the Arab spring was significant, and I can't honestly point to a good modern alternative today.


Do not be fooled: HN tends to artificially distort our perception of reality by manually ranking and de-ranking topics according to the political and monetary agendas of the day.

A very good example of that malignant practice is this very topic, which was manually de-ranked.

Those narratives are pushed by the resource owners whose wellbeing depends on disruption and hype. But stability is an enemy of disruption, and delivered functionality is the enemy of hype. So they have no choice but to de-rank topics like that; otherwise they lose power, because everybody starts to see what is hidden behind the mask.


The problem with that is going to be the same as with other similar projects: single instance only. This puts such projects into a no man's land: on the one hand, they are interesting to professionals; on the other, the absence of horizontal scaling makes them just a toy. Meanwhile, home users, for whom a single instance would be enough, could not care less about this kind of technology.


We're working on migrating our abstractions to k3s; for now the system is powered by Docker Compose. It wouldn't be that much work to move from DR to HA deployments. Thanks for the feedback.

