I feel you. I've written quite a lot of Common Lisp over the years and I've tried to show off its capabilities to co-workers, often in comparison to the language used at that job.
Often I just get "huh, that's neat, now let's get back to mining the coalface".
I have a suspicion that most Lisp articles we see on HN reinforce the perception that this is something not to be taken seriously, much less touched.
This particular post is better than most: it seems driven by genuine interest, and it's targeted at people already in the fold, which is fine. Though probably no one else is going to see it and think "I gotta get me some of that."
The intentional advocacy posts, on the other hand... I usually don't see them appealing even to the minority of programmers who are amenable to a low-employability platform. If anything, they help keep the platform low-employability, by making a weak pitch at the one moment someone was curious or bored enough to click the link.
I think the biggest issue with knowledge/skills/intelligence being potentially linked to genetics is that the topic tends to draw eugenics proponents into the discussion, and it's obvious how that conversation ends.
Even in binary you can see patterns. I'm not saying showing binary diffs is perfect (though it's better than showing nothing), but even my slow mammalian brain can spot obvious human-readable characters in various binary encoding formats. If I see a few in a row that don't make sense, why wouldn't I poke at it?
This particular file was described as an archive file with corrupted data somewhere in the middle. Assuming you wanted to scroll that far through a hexdump of it, there could be pretty much any data in there without it looking suspicious.
Pure speculation but my guess is a specific state actor ahem is looking for developers innocently working with open source to then strongarm them into doing stuff like this.
Many people are patriots of their countries. If a state agency approached them offering paid OSS work that would also help their country fight terrorism/dictatorships/capitalists/whatever-they-believe, they'd feel like they were killing two birds with one job.
While this seems plausible, it is notable that this person seems to have been anonymous from the get-go. Most open source maintainers are proud of their work and maintain publicly visible personas.
While I don't doubt there are people who would gladly do this work for money/patriotism/whatever, adding a backdoor to your own project isn't really reconcilable with the motivations behind wanting to do OSS work.
Yes, but this is more general, allowing you to use it in more cases. For instance, I use it heavily when converting old JavaScript to modern ES/TS. Using "var" everywhere? Replace them all with let. Using anonymous functions where arrow functions would do? Easy to change those all over (provided there are no "this" dependencies).
Have two thousand strings which all need the same edits? No need to do a find/replace operation, you can do it directly in the editor.
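To illustrate the kind of mechanical rewrite meant above (a sketch with made-up names, not any particular codebase), here's a typical before/after for the var-to-let and anonymous-function-to-arrow conversions:

```typescript
// Before: old-style JavaScript, the kind of code being modernized:
//   var total = 0;
//   list.forEach(function (item) { total += item.price; });

// After: `var` becomes `let`/`const`, and the anonymous function becomes
// an arrow function (safe here because the callback doesn't use its own
// `this`). `Item`, `list`, and `price` are invented for illustration.
interface Item { price: number; }

const list: Item[] = [{ price: 2 }, { price: 3 }];
let total = 0;
list.forEach((item) => { total += item.price; });

console.log(total); // 5
```

With multicursors, each of these conversions is a handful of keystrokes applied to every occurrence at once, rather than a per-site edit.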
You can absolutely use a regular expression find/replace to solve this. But using multicursors, you can just highlight the first "foo ", then press Ctrl+D repeatedly to select all instances, then hit the right arrow key so that your cursors are at "foo |bar/2322" and so on (with nothing selected), then use shift+right arrow to select bar, baz, blah, and all the other substrings, then use Ctrl+X to cut that list to your clipboard. Hit the delete key to get rid of the slashes. Then use Ctrl+arrow to move your cursors to just before foo ("|foo 2322"), paste, and hit space. Now you have "bar foo 2322". Repeat the same action to cut all the "foo" substrings, then move your cursors to the end, type ": ", and paste.
You get the idea. It sounds complex, but these are all composed of the same fundamental editing patterns -- all of the cursors act as if you had just that one cursor when you press the keys. You have to play with multicursors to really appreciate their power.
Most of the time, someone who is well versed in multicursors and their editor's cursor shortcuts (arrow keys, page up/down, shift/ctrl + arrow keys, etc.) will be able to complete this sort of textual manipulation much faster than with find/replace.
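For comparison, here's the regex find/replace mentioned above, sketched in TypeScript. It assumes the lines really do have the simple "foo <word>/<number>" shape used in the example; real data would need a different pattern:

```typescript
// The multicursor sequence above turns lines like "foo bar/2322" into
// "bar 2322: foo". A single capture-group replace does the same thing.
const input = "foo bar/2322\nfoo baz/17";
const output = input.replace(/^foo (\w+)\/(\w+)$/gm, "$1 $2: foo");

console.log(output);
// bar 2322: foo
// baz 17: foo
```

The trade-off is the one discussed here: the regex is faster to apply if you already know it, while the multicursor route gives you live feedback at every step.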
It doesn't have to be identical. For the method they're describing you just need search results, and you can use a regex for that. You can also command-click anywhere you want to leave a new selection point. (Or use various other find commands that search for things and add them to your selection pool.)
The benefit is that you're left with a cursor in each location, and you can then do absolutely anything you'd normally do with a single cursor, in every place at once. This includes things like copy/paste, which maintains a separate buffer for each selection. It also includes things that are actually tough to do with normal find/replace -- I could select the bit after a search result and switch it to title case, for instance.
You can do most things you'd use it for with find/replace. But sometimes it's easier to watch it happen as you type, rather than construct a fairly complex regex with groups and suchlike.
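For what it's worth, the title-case example above is doable in a scripted find/replace if a replacement callback is available; here's a sketch in TypeScript (the "name: ..." format is invented for illustration):

```typescript
// Title-casing the text after a search hit: awkward in a plain
// find/replace dialog, but straightforward with a replacement callback.
const text = "name: alice smith\nname: bob jones";
const fixed = text.replace(/^name: (.+)$/gm, (_match, rest: string) =>
  // Uppercase the first letter of each word after the search hit.
  "name: " + rest.replace(/\b[a-z]/g, (c) => c.toUpperCase()));

console.log(fixed);
// name: Alice Smith
// name: Bob Jones
```

The multicursor version achieves the same result interactively, with an editor "transform to title case" command applied to all selections at once.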
It helps when it's a bit more interactive, i.e. when I only need to replace some of the occurrences rather than all of them.
It's also useful when working with lists: you spawn cursors on "," or "<" or whatever symbol sits at a fixed position across lines, and then you can manipulate text on any number of otherwise different lines.
I find the standard blackbox_exporter far too limiting and static, so I wrote an exporter which queries DNS zones from the Google API and creates targets dynamically from that.
It also has a feature which will query internal databases to find expected targets (kinda like service discovery). This covers more specific checks than what the DNS-based targets will provide.
Together, these mean that essentially every endpoint in our infrastructure is monitored in some fashion.
The exporter performs SSL checks (lifetime remaining, etc.) as well as providing HTTP/TCP latency metrics.
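The "lifetime remaining" number such a check exports ultimately boils down to a date subtraction against the certificate's expiry. A minimal sketch of that calculation (the function name and inputs are invented, not the exporter's actual code):

```typescript
// Days of certificate lifetime remaining, as a Prometheus-style gauge
// value. `validTo` is the certificate's notAfter date, e.g. the
// `valid_to` string from Node's socket.getPeerCertificate().
function certDaysRemaining(validTo: string, now: Date = new Date()): number {
  const msLeft = new Date(validTo).getTime() - now.getTime();
  return msLeft / (1000 * 60 * 60 * 24); // may be fractional or negative
}

// Fixed "now" so the example is deterministic:
const days = certDaysRemaining(
  "2024-04-11T00:00:00Z",
  new Date("2024-04-01T00:00:00Z"),
);
console.log(days); // 10
```

Exporting the expiry as a gauge like this lets alerting thresholds ("warn under 14 days") live in Prometheus rules rather than in the exporter itself.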
Not American, but I can't stand using the toilet in places which don't have bidets. I feel disgusting and it ruins my day completely. I'd rather be uncomfortable and wait until I'm home, or not go out.
In an emergency, in a public restroom... after the normal TP mechanics, you can follow that up with some TP that's been wetted in the sink. Then dry TP to dry.
Optionally, the tough and brave can add a tiny bit of whatever hand soap of unknown formulation is available. You'll need additional passes of wet TP to rinse.
Not-recommended: Some people carry those supposedly 'flushable' wet wipe products, but I think those are known to cause problems for plumbing and sewage systems.
(I'd better not get hit by a bus today, because I don't want this to be my final HN comment. :)
I think this mostly comes down to whether applications can handle their workloads being restarted or scaled up/down based on demand.
It happens shockingly often that applications only support running as a single replica -- and it's even worse when an application cannot run concurrently with replicas of itself, which prevents smooth rolling updates.
IME, if applications are tolerant of restarts, or support concurrent replicas, then scaling up and down to meet demand is absolutely fine.