It's a decent protocol, but it has shortcomings. I'd expect most future use cases for that kind of thing to reach for a content-defined chunking algorithm tuned towards their common file formats and sizes.
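For anyone unfamiliar: content-defined chunking picks cut points from a rolling hash over the bytes, so an insert or delete only shifts boundaries locally instead of re-slicing the whole file. A toy Gear-style sketch of the idea (the gear table, mask, and size bounds here are made-up placeholders, not tuned for anything real):

using System;
using System.Collections.Generic;

static IEnumerable<byte[]> Chunk(byte[] data)
{
    // Toy gear table; a real chunker would ship a fixed, well-mixed table.
    var gear = new ulong[256];
    var rng = new Random(42);
    for (var i = 0; i < 256; i++)
        gear[i] = (ulong)rng.NextInt64();

    const int MinSize = 2 * 1024, MaxSize = 64 * 1024;
    const ulong Mask = (1UL << 13) - 1; // ~8 KiB average chunk

    var start = 0;
    ulong hash = 0;
    for (var i = 0; i < data.Length; i++)
    {
        hash = (hash << 1) + gear[data[i]];
        var len = i - start + 1;
        // Cut where the rolling hash hits the mask (content-defined),
        // clamped by the min/max bounds.
        if ((len >= MinSize && (hash & Mask) == 0) || len >= MaxSize)
        {
            yield return data[start..(i + 1)];
            start = i + 1;
            hash = 0;
        }
    }
    if (start < data.Length)
        yield return data[start..];
}

"Tuned towards their common file formats and sizes" mostly means picking the min/average/max chunk sizes (and maybe the hash) so that cut points line up with the structure of what you actually store.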
Pretty similar story in .NET. Make sure your inner loops are allocation-free, then ensure allocations are short-lived, then clean up the long tail of large allocations.
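By "allocation-free" I mostly mean boring mechanical changes, e.g. replacing a Split/LINQ parse with span slicing. Contrived example, names made up:

using System;
using System.Linq;

// Before: allocates a string[] plus a substring per field on every call.
static int SumCsvAllocating(string line) =>
    line.Split(',').Sum(int.Parse);

// After: walks the same input as a ReadOnlySpan<char>; nothing is allocated.
static int SumCsv(ReadOnlySpan<char> line)
{
    var sum = 0;
    while (!line.IsEmpty)
    {
        var comma = line.IndexOf(',');
        var field = comma >= 0 ? line[..comma] : line;
        sum += int.Parse(field);
        line = comma >= 0 ? line[(comma + 1)..] : ReadOnlySpan<char>.Empty;
    }
    return sum;
}

Console.WriteLine(SumCsv("1,2,3")); // 6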
.NET is far more tolerant of high allocation traffic since its GC is generational and overall more sophisticated (even if at the cost of tail latency, though that is workload-dependent).
Doing huge allocations that go to the LOH is quite punishing, but even substantial inter-generational traffic won't kill it.
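The workaround for the LOH case is usually to rent rather than allocate. Rough sketch, buffer size picked arbitrarily:

using System.Buffers;
using System.IO;

// A fresh byte[1_000_000] per request goes straight to the LOH (the threshold
// is ~85,000 bytes); renting from the shared pool reuses the same big buffers.
static void ProcessRequest(Stream input)
{
    var buffer = ArrayPool<byte>.Shared.Rent(1_000_000);
    try
    {
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            // ... consume buffer[..read] ...
        }
    }
    finally
    {
        ArrayPool<byte>.Shared.Return(buffer);
    }
}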
You're wrong. Cows will eat meat, horses will eat meat, pigs will eat meat, chickens will eat meat, deer will eat meat. If they can get it in their mouth, they will eat it.
It's plants you need a fancy setup for. Also, the microbiome thing, surprisingly, isn't universal; a lot of animals have no stomach microbiome to speak of.
The point of eating meat is that it's easy to digest. This isn't a case where two things are difficult in different ways and you want to do the thing you're specialized in. Meat is easy to digest, and plants are hard to digest, no matter what.
I would guess that a digestive system that can extract adequate nutrition from grass will extract something of value from meat. On the other hand, if the meat content became a significant part of the diet, I can imagine it could become harmful, for example by messing with the digestive process or by delivering toxic levels of certain products.
>What is the point of eating something that is hard to process and digest and has no nutritional value for you
Wouldn't that make it dietary fiber, then? What functions as dietary fiber varies from species to species, but humans, for example, eat exactly that kind of thing for GI health. Birds of prey, for instance, eat castings (fur and feathers), which work like dietary fiber for them; feeding them just a steak without the indigestible bits would be unhealthy, because they couldn't properly form and regurgitate pellets. Some animals might not need anything that functions like dietary fiber, but for at least some animals - like humans - eating certain indigestible things is important for good health.
"Better" is kind of a vague term. A more precise and limited statement is that meat has the highest protein quality index. There could be some other disadvantages, depending on your species.
Turtles are carnivores, no? They are bitey as hell. All of them. You catch them on cut bait, worms, minnows when fishing. Even the ones without very sharp mouths, like softshell turtles.
This is usually the case with C#'s equivalent as well. Enumerables and LINQ are nice options to concisely express logic, but you won't see them in hot paths.
Unfortunately, the baseline allocation cost is hard to avoid: all LINQ methods (other than those returning scalar values) return the interface IEnumerable<T>, and IEnumerable<T> itself hands back an interface-typed IEnumerator<T>. Even with escape analysis, the iterator-selection logic is complex enough to be opaque to the compiler, so at best it can get rid of the IEnumerable<T> allocation but not the enumerator itself, and only when inlining allows it.
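To make the boxing part concrete: iterating a List<int> directly uses its struct enumerator, but going through the interface boxes that enumerator onto the heap, and a LINQ chain then adds its own iterator object plus delegates on top:

using System;
using System.Collections.Generic;
using System.Linq;

var list = new List<int> { 1, 2, 3, 4 };

// foreach over the concrete List<int> binds to List<int>.Enumerator,
// a struct, so nothing is allocated here.
foreach (var x in list) Console.Write(x);

// The same list through IEnumerable<int>: GetEnumerator() now returns the
// interface IEnumerator<int>, which means the struct enumerator gets boxed.
IEnumerable<int> asInterface = list;
foreach (var x in asInterface) Console.Write(x);

// And a LINQ chain allocates its iterator object(s) and the delegates as well.
var query = list.Where(n => n % 2 == 0).Select(n => n * 2);
foreach (var x in query) Console.Write(x);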
There are community libraries that implement a similar API surface with structs, which can be completely allocation-free and are frequently dispatched statically.
Moreover, with the advent of `where T : allows ref struct` you can finally write a proper LINQ-like abstraction over Span<T>s, even if it's a bit less pretty. I have been playing with a small toy prototype[0] recently and it looks like this:
// Effectively C's array constant
var numbers = (ReadOnlySpan<int>)[1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
var iter = numbers
.Where((int n) => n % 2 == 0)
.Select((int n) => n * 2);
// Inspired by Zig :)
using var vec = NVec<int, Global>.Collect(iter);
The argument types for the lambdas need to be provided to work around C#'s lack of full Hindley-Milner type inference, but this iterator expression is fully statically dispatched and monomorphized, save for the lambdas themselves. Luckily, the JIT can profile the exact method types passed as Funcs and perform further guarded devirtualization, putting this code painfully close to the way Rust's iterators are compiled.
At the end of the day, .NET's GC implementations can sustain 4-10x the allocation throughput of Go's (it's not strictly better - just different tradeoffs), with further tuning options available, so an allocation here and there is not the end of the world. Not all LINQ methods allocate in the first place, and many allocate very little thanks to optimizations made in that area across recent releases.
I had to deal with the second problem in a file synchronization app. The solution was to propagate a “device id” through the request and through the poll/push, so each device could ignore changes it had originated itself.
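In shape it was roughly this (hypothetical names, not the actual app's code):

using System;
using System.Collections.Generic;
using System.Linq;

// Every change is stamped with the id of the device that produced it.
record Change(string Path, long Version, string OriginDeviceId);

class SyncClient
{
    private readonly string _deviceId = Guid.NewGuid().ToString();

    public Change MakeLocalChange(string path, long version) =>
        new(path, version, _deviceId);

    // When polling (or receiving a push), drop our own echoes so an upload
    // doesn't come back around as a spurious "remote" edit.
    public IEnumerable<Change> FilterIncoming(IEnumerable<Change> changes) =>
        changes.Where(c => c.OriginDeviceId != _deviceId);
}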
To offer a counter-anecdote: for a while I enjoyed reading books from the list of joint winners of the Hugo and Nebula awards[1] - and later from the lists of winners of a single award (either the Hugo or the Nebula).
https://humanlegion.com/hugo-award-sales-figures/ has some data for one year. "They do have an effect, but probably no more than a few thousand sales for most books, maybe over ten thousand for the luckiest, and then only in exceptional years."
I think Banks possibly just had the poor planning to die too early, there. Both because of the nationality thing (it has gotten a _bit_ less American-centric), and because it feels like his _style_ fits better with current winners than 90s winners (I was actually quite surprised to discover that Ancillary Justice was _not_ in some way a Culture homage).
Yeah, I've read a couple of the more modern Hugo award winners and thought that they sucked. Maybe I just got unlucky, but it certainly didn't inspire confidence that I would enjoy reading the award recipients.
I think I've read a large proportion of recent Hugo winners, and they definitely tend to be better than average. The nominees are normally pretty good in general.
"Better than average" is a pretty low bar. I'd certainly hope that a winner of one of the more recognized SF awards would be somewhere on the upper end of the quality distribution whether or not it's a great indication for