
If you're going to make a big claim about sort speed, tell me how the speed compares for various data. How do the algorithms behave when the data is already ordered, when it's almost (but not quite) ordered, when it's largely ordered, when it's completely random, and when it's in reverse order? This, as well as the size of the dataset, is what we need to know in practice.
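
To make that concrete, a benchmark should at least generate those input shapes. Something like this Rust sketch (the names are made up, not taken from any particular benchmark suite); "almost sorted" here is just a sorted array with roughly 1% of the elements swapped at random:

  // Tiny xorshift RNG so the sketch has no dependencies; use a proper RNG crate in practice.
  fn xorshift(state: &mut u64) -> u64 {
      *state ^= *state << 13;
      *state ^= *state >> 7;
      *state ^= *state << 17;
      *state
  }

  // The input shapes a sort benchmark should cover, each of length n.
  fn inputs(n: usize) -> Vec<(&'static str, Vec<u64>)> {
      let mut s = 0x12345678_u64;
      let random: Vec<u64> = (0..n).map(|_| xorshift(&mut s)).collect();

      let mut sorted = random.clone();
      sorted.sort_unstable();
      let mut reversed = sorted.clone();
      reversed.reverse();

      // "Almost sorted": sorted, then ~1% of the elements swapped at random.
      let mut almost = sorted.clone();
      for _ in 0..n / 100 {
          let i = (xorshift(&mut s) as usize) % n;
          let j = (xorshift(&mut s) as usize) % n;
          almost.swap(i, j);
      }

      vec![
          ("sorted", sorted),
          ("almost sorted", almost),
          ("reversed", reversed),
          ("random", random),
      ]
  }

  fn main() {
      for (name, mut v) in inputs(1_000_000) {
          let t = std::time::Instant::now();
          v.sort_unstable();
          println!("{name}: {:?}", t.elapsed());
      }
  }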


Rather than, or at least in addition to, raw measured speed on a specific piece of hardware (which is often affected in hard-to-understand ways by niche optimisation choices and quirks of that particular hardware), I really like the choice to report how many comparisons your algorithm needed.

For example, Rust's current unstable sort needs ~24 comparisons per element on average to sort 10^7 truly random elements, but if instead all those elements are drawn (still at random) from only 20 possibilities, it needs only a bit more than five comparisons per element, regardless of whether there are 10^3 elements or 10^7.

Unlike "On this Intel i6-9402J Multi Wazoo (Bonk Nugget Edition) here are my numbers" which is not very useful unless you also have the i6-9402J in that specific edition, these comparison counts get to a more fundamental property of the algorithm that transcends micro architecture quirks which will not matter next year.


"My boss wants to buy systems with the Intel i10-101010F Medium Core Platinum (with rowhammer & Sonic & Knuckles), can you buy this $20,000 box and test your program so I can write him a report?"



What's your point? The paper you're linking does not include the analysis the post you're responding to is asking for.


It does give some insight into what you seek, at least. For example: “We find that for smaller n ≲ 262144, JesseSort is slower than Python’s default sort.”

I’d like to see a much larger n, but the charts in the research paper aren’t really selling JesseSort. I think that as more and more “sorts” come out, they each get more niche. JesseSort might be good for a particular dataset size and ordering/randomness, but from what I see, we shouldn’t be replacing Python's default sorting algorithm.


Yup. There must have been any number of photos taken by chase planes during development.


The question is, how many were taken at supersonic speed? What non-military aircraft could keep up with the Concorde at Mach 2? Only another Concorde.


Concorde development was a joint UK & France national project. They would have had easy access to military aircraft. An aircraft like the Lightning might only just have been able to intercept, but could easily have observed pre-arranged tests.

I wasn't on the engineering team ;) but apparently they planned 4000 hours of test flights. https://web.archive.org/web/20150316210132/http://aviationwe...

It's almost inconceivable that the test flights would not have been closely recorded, especially the significant ones, including trans-sonic and supersonic ops. Despite the best design and wind-tunnel work, you'd expect things to go wrong, and you'd really want to learn as much as possible from any incidents/events.

Unfortunately, all this happened well before the internet age, and so records and images are not so easily found :(


Bit of an aside, but the "Infinite Noise TRNG" seems to generate not-very-random "raw" data, which it then hashes so that the output appears as random bits.

Am I missing something, or wouldn't it be better to start with highly random raw data, and hash that to get more bits-per-second?


Seems like a fashion thing, but the Linux distros I've recently checked out all defaulted to a dark mode. Fine at night, but a pain to read normally :(


> and purchased box fans because servers were literally catching on fire

Ah yes, or a collection of R2D2 portable air conditioners, with the tails draped out through the window.

Or a coolant leak that no one noticed until the sub-floor was completely full and the floor panels started to float!


> Everywhere a C constant-expression appears in the C grammar the compiler should be able to execute functions at compile time, too, as long as the functions do not do things like I/O, access mutable global variables, make system calls, etc.

That one is easily broken. Pick a function that runs for a lloooonngg time...

  int busybeaver(int n) {...}    // pure function: returns the max number of steps an n-state busy beaver machine runs before halting

  int x = busybeaver(99);


Ada has had something similar, and very flexible, since the 80s ... like:

  loop
    Get(Current_Character);
    exit when Current_Character = '*';
    Echo(Current_Character);
  end loop;
There's not that much new under the prog lang sun :(


    do {
        Get(Current_Character);
        if (Current_Character == '*') break;
        print(Current_Character);
    } while (true);
I don't see why this needs a new construct in languages that don't already have it. It's just syntactic sugar that doesn't actually save any work: the version with the specialized construct isn't really any shorter and looks pretty much the same. Both have exactly one line in the middle marking the split, and those lines look very similar anyway.


> I don't force you to use SSL/TLS to connect here. Use it if you want, but if you can't, hey, that's fine, too.

She accepts both http AND https requests. So it's your choice: do you want to know who you're talking to, or do you want speed :)


Absolutely - current Windows is truly horrible: ads, ads, ads, crash :( If I were designing something to be deliberately distracting, annoying, and confusing, it would look a lot like Windows!


Re the need for anonymous papers, see: https://en.wikipedia.org/wiki/Journal_of_Controversial_Ideas and https://journalofcontroversialideas.org/ (the JCI tends to the arts and philosophy).


As I recall, the JCI does know the identity of each author.

