Hacker News | jspdown's comments

Can't agree more. I have the feeling I've been in a similar situation very recently and following this principle would have served me better.

That's really the piece of this article I'll take home.


Already posted recently: https://news.ycombinator.com/item?id=40937119 (1178 points)

Many interesting discussions there.


Could you elaborate on this? Why do you think it's a bad idea?


It might not be suitable for your use case, but have you tried delegating the ACME DNS challenge to a different zone hosted by yourself?
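For context, DNS-01 delegation usually works by pointing a CNAME at a zone you control; a minimal sketch (all names here are illustrative):

```
; In the zone of the domain you want a certificate for:
_acme-challenge.example.com.  300  IN  CNAME  _acme-challenge.auth.example.net.

; The ACME client then writes its TXT validation record into the
; delegated zone (auth.example.net), which you host yourself.
```

Tools like acme-dns are built around exactly this pattern, so the validation records never touch your main zone.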


If you like Caddy for its ACME capabilities, then you might enjoy Traefik as well. It supports HTTP, TLS-ALPN, and DNS challenges, and can be configured in one line as well.
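For what it's worth, a sketch of what that looks like with Traefik's static configuration (the resolver name, email, and storage path are placeholders):

```shell
traefik \
  --certificatesresolvers.le.acme.tlschallenge=true \
  --certificatesresolvers.le.acme.email=admin@example.com \
  --certificatesresolvers.le.acme.storage=/letsencrypt/acme.json
```

Swap `tlschallenge` for `httpchallenge` or `dnschallenge` depending on which validation method you want.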


I already use it as a web server and reverse proxy, so it's a better match. I've tried Traefik in the past and it wasn't as simple as Caddy to configure. Caddy has some well-thought-out magic (like creating a sane, modern PHP config with just one line).
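As an illustration, that one line is the `php_fastcgi` directive in a Caddyfile (domain, root, and socket path are placeholders):

```
example.com {
	root * /srv/mysite
	php_fastcgi unix//run/php/php-fpm.sock
	file_server
}
```

The single `php_fastcgi` line expands to the usual try_files/rewrite/FastCGI boilerplate that would otherwise take a page of nginx config.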


At first it would be a big win for companies. But as jobs get absorbed, the productivity boost will become pointless. The pool of potential buyers will shrink because of unemployment, and we will end up with the opposite of what you are describing.

I'm not saying that's what will happen, nor your version. Things are never that simple.

But shouldn't we be more cautious, and take the time to understand the shift and prepare as a society for this big change? Why rush into something that could negatively affect millions of people's lives in the hope of a productivity boost?

The outcome is not all good, not all bad. We need careful thinking and planning.


You can always make your code self-explanatory, but the cost in abstraction and maintenance sometimes results in a negative ROI. In such cases, writing comments is often the best solution.

There's some excitement in identifying those cases, and that's where I enjoy documenting the intention of my code.


>You can always make your code self explanatory

This code is done this way because of EU regulation #1374 regarding maritime tracing.

This code seems inefficient and silly, but it's written that way to get around a specific rendering issue in Opera Mini, which needs to be supported because our project is a required governmental service and has to support basically everything.

This code is dealing with an issue in Safari that is supposed to be fixed by the June 2024 release. Please check if it is still an issue; test by doing the following...

TL;DR: no, you cannot always make your code self-explanatory.
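To make that concrete, here is a small sketch of the kind of "why" comment being argued for; the regulation and the rounding rule are invented for illustration:

```python
import math

def billable_weight_kg(weight_kg: float) -> int:
    # The comment carries the "why" that the code alone cannot.
    #
    # (Invented example:) Regulation X requires declared maritime cargo
    # weights to be rounded UP to the next 10 kg, so we cannot use the
    # "obvious" round(). Do not "simplify" this.
    return math.ceil(weight_kg / 10) * 10
```

No amount of renaming makes the regulatory constraint visible from the code itself; only the comment does.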


The state of the web is very sad. Most people with a fiber connection don't even notice how slow it has become. But when you are still on a 2 Mbps connection, it's just plain horrible. I'm in that situation, and it's terribly painful. Because of this, I can't even consider not using an ad/tracker blocker.

Would love to see this test with uBlock Origin enabled.


> Would love to see this test with uBlock Origin enabled.

Me too. I suspect most of this code is for user tracking and ad management.


Tracking is a bit heavy, but from what I've looked at, the app code is usually much worse. I've looked at what Instagram and Jira ship during the initial load, and it's kind of crazy.


> But when you are still on a 2Mbps connection

What happens when you use modern apps on an iPhone 3G or the first Nexus phone? I don't understand: do people think that with better, faster computers and network speeds, we should focus on smaller and smaller apps and websites?


Your iPhone CPU doesn't occasionally turn back into an iPhone 3G CPU, but network availability does vary a lot.

You may also one day find yourself on a flaky 3G connection needing access to some web app that first loads twenty megabytes of junk before showing the 1 kB of data you need, and then it becomes clear what the problem is.


New Raspberry Pis compete with smartphones of the past, which in turn had compute comparable to servers of yesteryear. Moore's law has allowed developers to push more and move fast at the cost of being optimal. Many such cases.


Most people with a fiber connection? I bet a lot of people with money don't have a fiber connection, certainly not most people here on HN.


Read it as: "of the people who have a fiber connection, most..."


Yeah, I'm saying the relevance of that statement is pretty low, because most of us don't experience that, and certainly not enough of us to move the needle of JS culture.


I'm living in France, not off-grid and not very far from a big city, and we still don't have a fiber connection available. What I'm saying is that a lot of us are in this situation, even in developed countries. Those who accept shipping 10 MB bundles clearly forget that not everyone has the connection they have in their office.

To the point where the web is mostly unusable if you don't disable ads. I'm not against ads, but the cost is just too high for my day-to-day use of the internet.


Incremental, forward-only migrations (non-state-based). Then, for the how and when, it mostly depends on your constraints and sizes. There's no silver bullet: it's hard, it requires constant thinking, and it's a slow, often multi-step process.

I've never seen a successful fully automated, one-size-fits-all process.
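For illustration, a minimal sketch of the incremental forward-only idea over SQLite (the migration list and version table are invented here, not any particular tool's format):

```python
import sqlite3

# Ordered, append-only list: you only ever add new statements at the end.
MIGRATIONS = [
    "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    "ALTER TABLE users ADD COLUMN email TEXT",
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any not-yet-applied migrations, in order; return current version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    current = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM schema_version"
    ).fetchone()[0]
    for version, stmt in enumerate(MIGRATIONS, start=1):
        if version > current:  # forward-only: never undo, only apply newer steps
            conn.execute(stmt)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()
    return max(current, len(MIGRATIONS))
```

Running it twice is a no-op, and "undoing" a change is just another forward migration appended to the list.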


Are you talking about the mechanics? Like, more than just running a migration script on boot?


For local development, I cannot recommend lnav[1] enough. Discovering this tool was a game changer in my day-to-day life. Adding comments, filtering in/out, prettifying, and analyzing distributions are things that are hard to live without now.

I don't think a browser tool would fit in my workflow. I need to pipe the output to the tool.
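For anyone curious, the piping workflow looks like this (the commands feeding lnav are just examples):

```shell
# Pipe a command's output straight into lnav:
kubectl logs -f my-pod | lnav

# Or open log files directly:
lnav /var/log/nginx/*.log
```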

[1] https://lnav.org/

