Hacker News | Delphiza's comments

There are countless millions of people that are experts in certain domains that do not have the aptitude, confidence, or training to use high-code tools. It is insulting to imply that those people are not experts because they don't use a 'real programming language'. Besides, while imperative programming languages are good at solving a great many problems, they are far from being the best tool for every person, every problem, and every environment.


Connecting EV charging stations to the grid is a significant problem: the peak grid maximum demand* has to be balanced against EV charging demand. While it may be okay to have a busy highway/motorway service station properly hooked up, it gets more difficult to apply that more broadly. How much charging capacity should a shopping centre provide? What about an occasionally-used sports venue?

I have worked with customers that have distribution centres in city locations to charge online-shopping vehicles overnight for the next day. All vehicles are plugged in and the software needs to charge different vehicles up at different times and rates, in order to spread the load.
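As an illustration of the load-spreading involved, here is a minimal sketch of a greedy per-slot allocator under a site power cap. All names, rates, and numbers are made up for the example, not taken from any real charging system:

```python
# Toy sketch: spread overnight charging of a fleet under a site power cap.
# Each slot, vehicles with the most energy still needed charge first,
# without exceeding either the site cap or each vehicle's max rate.

def schedule_charging(vehicles, site_cap_kw, slot_hours=1, slots=8):
    plan = []  # one dict per slot: vehicle id -> kW allocated
    remaining = {v["id"]: v["needed_kwh"] for v in vehicles}
    rate = {v["id"]: v["max_rate_kw"] for v in vehicles}
    for _ in range(slots):
        order = sorted(remaining, key=remaining.get, reverse=True)
        budget = site_cap_kw
        slot_alloc = {}
        for vid in order:
            if remaining[vid] <= 0 or budget <= 0:
                continue
            kw = min(rate[vid], budget, remaining[vid] / slot_hours)
            if kw > 0:
                slot_alloc[vid] = kw
                remaining[vid] -= kw * slot_hours
                budget -= kw
        plan.append(slot_alloc)
    return plan, remaining

fleet = [
    {"id": "van1", "needed_kwh": 60, "max_rate_kw": 22},
    {"id": "van2", "needed_kwh": 40, "max_rate_kw": 22},
    {"id": "van3", "needed_kwh": 80, "max_rate_kw": 22},
]
plan, left = schedule_charging(fleet, site_cap_kw=40)
```

Real systems also juggle departure times, tariffs, and battery curves, but the core trade-off is the same: the sum of charging rates in any interval must stay under what the connection can supply.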

It is far easier to dig a hole in the ground and fill a tank with delivered fuel - all the logistics for this already exists.

* Maximum demand is a well-known concept that relates to the maximum current draw in a 30-minute period, which is used to provide the necessary infrastructure from the electricity supplier.
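For illustration, the metering works out roughly like this (toy numbers; demand is averaged over each 30-minute interval, and the supplier sizes the connection on the highest interval):

```python
# Toy sketch of the maximum-demand calculation. Numbers are illustrative.
half_hour_kwh = [12.0, 15.5, 30.0, 28.5, 9.0]  # energy per 30-min interval

# average power in each interval: kWh / 0.5 h = kW
interval_kw = [e / 0.5 for e in half_hour_kwh]
max_demand_kw = max(interval_kw)  # 60.0 kW for this series
```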


Because there was no practical opportunity to put in the primary hyperbole - "Game Changing!"


While governments may be interested in moving stuff off Azure/AWS/GCP, in my experience, most businesses don't care enough. It is difficult enough to get them to spend money on the bare minimum needed for security. Yes, they will want 'sovereign data' because it looks good on the annual report, but most would not do it for double the cost.

Europe and other parts of the world should reduce their dependency on American tech, or any other tech that locks them in - regardless of whether the lock-in is geopolitical or something else (SAP?). They're not going to be able to do it soon, so will wait out the next four years. Governments do have a serious problem though, but are going to struggle to get the investment together.


SAP is German. "Largest non-US software company by revenue", says Wikipedia. Doesn't make it suck less, though.

Re: investment, there's long been a solid set of EU open source contributors, from Torvalds himself through SUSE etc. I think if we identified the right 100 people and gave them money and a sense of civic mission, the EU could achieve a lot in this area. However, the EU's own anti-subsidy rules trip it up here.

The history of Munich is very interesting here: https://www.zdnet.com/article/linux-not-windows-why-munich-i...

Of course it's also political. If you want an alternative to US megacorps you have to put up with reds and greens.


The inclusion of SAP, as a German company, was intentional. Lock-in is not just American, and these days it is extremely difficult to have independence from large tech companies. It is not impossible, but requires significant government investment, and a desire for businesses to move. I don't see European businesses moving away from Microsoft and Google email services in big enough numbers to create a sustainable market for alternatives.


European tariffs on AWS/Azure/GCP would be an option. European tech companies expanding into SaaS is less of a mountain than physical production.


"EU could tax big tech if Trump trade talks fail, Von der Leyen tells FT" - https://www.reuters.com/technology/eu-could-tax-big-tech-if-...


Moving the low-cost jobs offshore was fine until automation filled a lot of those jobs. Now the high-skill automation expertise and infrastructure (production lines and robots) are also offshore. I have done my fair share of western factory tours, and the number of people on the factory floor is soberingly low... they are simply not needed, as the line runs like a vast, complicated machine.


Japan led in automation in the 90s before the rise of China put a stop to those investments paying off. Now China is making those same investments at a time when the tech is much better. America could solve its manufacturing problem in the future just by importing China automation tech.


>"America could solve its manufacturing problem in the future just by importing China automation tech."

Assuming there is no embargo by then.


It is theoretical, I also doubt China would help the USA develop like that directly and lose its advantage. We would have to trade something really valuable in return (like modern semiconductor or jet turbine tech).


As part of our IoT offerings, we tried quite hard to build devices with mobile connectivity about 7 years ago. This was just as low power 5G, NB-IoT and similar technologies were going to become a thing. We gave up because it was too much effort for little return, and it was better to focus on doing things with data, rather than collecting it. Even getting prototypes up with powered fanless PCs (SBCs) and Mini-PCIe or M.2 modems was harder than it should have been - you would think it would be easy with off-the-shelf devices and drivers.

Small Internet-connected devices are still needed, despite the perception that IoT is dead. Mobile networks and the modem supply chain are definitely holding the market back. On the plus side, for stationary powered devices most people are happy to connect to wi-fi. For low-power devices, LoRa, with private gateways, seems to be a standard. Mobile connectivity, as used in outdoor vehicle and asset tracking, is still stuck fighting with modems, as per the OP.


You might want to take another look, I went through a similar process a couple of years ago, and am now doing it again because our original equipment supplier decided they'd like a life (RIP PC Engines). There's a thriving industry of companies providing fanless ARM based machines either shipping with LTE modems, or with M2 & SIM slots on the board to provide your own.

We install in industrial environments where an accessible internet connection is far from a given, having LTE on all our devices means that we can almost always give the device a way to call home. I can strongly recommend Compulab's devices, which you can purchase as a fully assembled unit that just needs a SIM card put in it (I can't recommend the Linux distribution they run, but you're free to flash them with whatever you'd like).


> There's a thriving industry of companies providing fanless ARM based machines either shipping with LTE modems, or with M2 & SIM slots on the board to provide your own

When I did this about 10 years ago we had quite a number to pick from. Building your own was possible, but only because our organization had the capability and expertise to do it. We settled on 3 off-the-shelf ones, depending on cost and the number of I/Os the customer needed.

DIY basically means 'first make your own computer with the ARM/MIPS/x86 chipsets', then 'spin your own special firmware for it', then 'build your own ground-up Linux distro or similar, with compiler chain and SDK to work with it'. You may be able to get someone to sell/give you a reference layout/SDK. After all of that, you are ready to add in a modem. Also prepare for the certifications of all the mobile networks you want to run on, plus software for you to interact with the cell modem. Oh, also you need to work on getting yourself provisioned correctly in the mobile networks. When you do it for five devices it is a couple of hours of playing with an API, but you probably want hundreds of them, so be prepared for managing that, plus billing. Oh, also you need to manage EoL for your parts - many IoT installations are looking at you hanging around for 10+ years.


> Also prepare for the certifications of all the mobile networks you want to run on

AFAIK the modem maker actually does this. You would rather check that the chip is certified for what you need before you buy.

> also you need to work on getting yourself provisioned correctly in the mobile networks

AFAIK also, this is a service provided by "virtual operators", companies that pass deals with the main operators and provide advanced services for machine-to-machine uses, for instance.

But perhaps these didn't exist 10 years ago. Things really started to move where I operate when they EoL'd copper wires.


Yeah, this is all true in my experience. We buy devices with the appropriate Quectel modem for the region, stick a SIM card in from a provider who has relationships with operators worldwide, then ship it to site. Once it gets given some power the device will connect to whatever mobile network has best signal and call home - we don't actually do a lot of international deployments but even within the country it's hugely useful to have SIMs that will happily roam between all the major operators.


> I can strongly recommend Compulab

Thanks for the rec. This board looks promising, including Debian/Yocto support, and they apparently have an option for hardware customization, https://www.compulab.com/products/sbcs/sbc-iot-imx8plus-nxp-...


Thanks... I'll have another look. I always assumed that device/modem supply would catch up, but they always seemed held back by the established mobile network operators. Also, covid-related supply-chain issues stopped a lot of products in their tracks. We waited up to a year for modems in some cases. I suppose that has all flushed through the system by now.


Seconding this about Compulab. If they made a box with a Marvell CN9130 it would be really nice.


Agreed, would be good to have an alternative to Solidrun.

On the consumer side, QNAP offers both a NAS and network switch based on Marvell CN9130.


Why mobile connectivity instead of Bluetooth?

As an end user, I prefer my devices to use Bluetooth and a hub device like a phone or laptop, rather than each one having a direct mobile connection.


Because these aren't regular consumer gadgets. Think more like "lamppost mounted traffic counter", Bluetooth connectivity to a smartphone would be completely useless.


That changes the situation a little bit, but could a bluetooth mesh network of lamppost mounted traffic counters work just as well? Without the hassle of giving them all direct internet connections?

I've been playing with Bluetooth and looking into the mesh networking a little bit lately, so I'm curious about the trade-offs and decision-making that go into a real product.


Heh, thanks for that oddly-specific example that is an exact device that I worked on a few years back :)


As a first order of business, a sufficiently advanced AGI would recommend that we stop restructuring and changing to a new ERP every time an acquisition is made or the CFO changes, and to stop allowing everyone to have their own version of the truth in excel.

As long as we have complex manual processes that even the people following them can barely remember the reason for, we will never be able to get AI to smooth it over. It is horrendously difficult for real people to figure out what to put in a TPS report. The systems that you refer to need to be engineered out of organisations first. You don't need AI for that, but getting rid of millions of Excel files is needed before AI can work.


I don't know that getting rid of those wacky Excel sheets is a prerequisite to having AI work. We already have companies like Automation Anywhere watching people hand-carve their TPS reports so that they can be recreated mechanistically. It's a short step from feeding the steps to a task runner to feeding them to an AI agent.

Paradigm shifts in the technology do not generally occur simultaneously with changes in how we organize the work to be done. It takes a few years before the new paradigm works its way into the workflow and changes it. Lift-and-shift was the path for several years before cloud native became a thing, for example. People used iPhones to call taxi companies, etc.

It would be a shame to not take the opportunity to tear down some old processes, but, equally, sometimes Chesterton's fence is there for good reason.


That is a good point that US-based developers may not be aware of. The EU CRA (Cyber Resilience Act) mandates "an obligation to provide duty of care for the entire lifecycle of such products", mostly by requiring updates for security vulnerabilities. Any software that connects to the network (or Internet) has to be assumed to have a vector for vulnerability at some point in the future. This means that it has to be updatable, and cannot be a perpetual license.


A lot of software has no need to connect to the Internet though, sounds like a "loophole" ?


What is often missed about Y2K mitigation is just how much software was discarded in favour of something more modern.

I had a happy customer that was using software that I built in the early 90s. I was asked to 'Y2K certify' it, which I couldn't do for free, so they had to ditch it before Y2K. Even though I would have used proper datetimes, there may have been a couple of places where there were problems, such as printed reports or displaying dates on the screen. I certainly couldn't underwrite that it would work without reviewing it extensively.
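To show the kind of display bug I mean, here is a hedged sketch (hypothetical formatting code, not from the actual application): even with proper dates held internally, a report that hard-codes the century and prints a two-digit year goes wrong at the rollover.

```python
# Illustrative only: the classic Y2K display bug. The stored date is a
# proper date object; only the legacy-style report formatting is broken.
from datetime import date

def print_report_date(d):
    # hypothetical legacy formatting: "19" hard-coded, two-digit year
    return "%02d/%02d/19%02d" % (d.day, d.month, d.year % 100)

print_report_date(date(1999, 12, 31))  # "31/12/1999" - looks fine
print_report_date(date(2000, 1, 1))    # "01/01/1900" - wrong century
```

Bugs like this hide in any code path that formats, prints, or sorts dates, which is exactly why certifying an old application "Y2K ready" meant reviewing all of it.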

Apart from COBOL/mainframe projects to fix bugs, a significant part of Y2K preparation was throwing out whatever cobbled-together software businesses had and replacing it with SAP. Indeed, a large part of the SAP sales pitch in the mid-nineties was 'Y2K ready'. The number of SAP licences sold in the mid-to-late nineties is useful data on Y2K mitigation that is often overlooked.

It was highly likely that various applications within any particular business had Y2K problems, as with my little application. If they didn't, you had to get the original vendor to certify them as being 'Y2K ready'. For many people involved, it was quicker, cheaper, easier, and less risky to replace with SAP rather than review and fix everything.

I am convinced that ERP (SAP, Oracle) and CRM (remember Siebel?) were given major boosts in adoption due to Y2K alone.


By what definition of 'hype'? It was absolutely a significant problem that needed to be 'hyped' in order to get people involved and businesses to spend money. I was involved in the periphery of Y2K, and saw significant budgets being released as a result of the hype. The OP's point is that because those budgets were spent (on the right things, hopefully), the failures did not occur, which means that some hype was necessary.


Maybe you don't remember but the news at the time was talking about it as if it was going to be a nuclear apocalypse or something.

It's annoyingly hard to search for old newspapers & TV news (does anyone know of a good free site for that?). The first thing I found was this Y2K survival kit:

https://www.etsy.com/uk/listing/1824776914/vintage-y2k-survi...

> Futurists predict that riots will sweep through cities all around the world after Y2K knocks out electricity and destroys banking and commerce systems.

That's the sort of nonsense that was "promised", and to a large extent, exploited by Y2K consultants.

So when people say "it wasn't a problem", they aren't saying "you didn't have to do some IT work to make sure your phone system didn't go down" or whatever. They mean there were no planes falling out of the sky, trains crashing, riots & looting, etc.


The year2000(dot)com site was the clearing house for much of the non-hype articles from IT folks.

The Wayback Machine has it archived. I'd suggest looking at the copies of the site towards the end of 1999. A good example of such an article was 'a crock of clocks'.

If you want to hear my take on Y2K 20 years later, check out the podcast: Y2K: An Autobiography.

Cheers

