> Production of low-margin processors, such as those used to weigh clothes in a washing machine or toast bread in a smart toaster, has also been hit. While most retailers are still able to get their hands on these products at the moment, they may face issues in the months ahead.
I understand why the new advanced chips could face shortages, but why are there shortages of these basic chips? Can't they be made anywhere, and more easily?
> why are there shortages for these basic chips? Can’t they be made anywhere, and more easily?
Not really. Semiconductor fabs are built around "tools" from manufacturers like AMAT and Nikon. Those tool vendors make most of their money from selling new tools for fancy new processes, not supporting 20-year-old stuff. Eventually stuff breaks, and fabs have to offline these older processes.
The way this works in the tech industry is that "chips" are actually software: if your old manufacturer isn't keeping up, you resynthesize your VHDL or Verilog for a new fab, rev your board design or whatever, and keep going.
But other industries aren't so agile. They have older designs without design teams to support them, or even chip designs that they retain only as masks and not HDL. Those parts don't port cleanly to newer high-volume logic.
But lifetime support doesn't help if the parts for your machine aren't available anymore. If your 20-year-old machine breaks and there are no parts to fix it, you might be offered an equivalent replacement; if your old chip masks are incompatible with the replacement machine, you can't immediately make what you need. So for some companies, having lifetime support might not help with the manufacturing slowdown when an old machine breaks.
I can't vouch for the parts, but they supported machines from the eighties. This is not some cheap consumer product with a half-life of 13 months.
Underrated comment. Though the "worth it" bit is the trick.
In my estimation these older parts that "just werk" should be getting inherited and iterated on as a public good.
The idea that means of production should phase into public trust tends to get everyone in a tizzy though. I'd like to see a public "foundry of last resort" that focuses on being able to make anything.
The first is a statement of the second law of thermodynamics. The second clause is just obviously true. Go call up Fujitsu and try to order more of a chip they made for you in 1.5um in 1988.
Yes of course the equipment breaks down but older equipment is easy to repair. It is very rare for a fab to be decommissioned and the equipment scrapped - in fact I have never heard of this happening to any production facility with 6” or larger wafers. That equipment will go to de-bottlenecking at some other fab and net production capacity for the node will increase.
Obviously many very old chips are out of production but not because the equipment broke down and was never repaired.
The corollary to your point, then, is that all these fabs have immense idle capacity of existing installed tools which they aren't using but retain simply because nothing ever "broke down"? Obviously that's ridiculous.
You're interpreting me pedantically while actually agreeing with my point, I think. Old processes don't have the capacity they used to[1]. If you don't like "stuff breaks" then how about "eventually the ROI on the equipment goes negative relative to the business so the line is idled and the fab real estate repurposed to make more profitable modern stuff." OK?
[1] Which, again, is just a "duh" kind of point and I can't believe we're arguing about it.
It’s not just about supporting new processes. Many tool/machine vendors are backlogged by years because they simply don’t have the capacity to make more than a few of those machines every year. Even if someone wanted to invest in new manufacturing, they would likely have to wait a few years to start production.
Secondly, some legacy manufacturers of semiconductor parts lost money on their capacity-building investments during the dot-com bust. The semiconductor industry is brutal, and there is a genuine fear that overcapacity will make it hard to deal with whatever bust follows this boom.
Correct me if I'm wrong, but it seems like in the modern economy, for many companies, most spending decisions are made on a long-term basis: on the principle of what promises profit long term, regardless of immediate factors, and with an eye toward not overcompensating.
This has all sorts of bizarre consequences. In the middle of the PPE shortage, hospitals prevented their employees from buying PPE themselves but would still only buy PPE at the lowest price on a long-term contract. And you had the Texas company that loudly proclaimed it couldn't sell its PPE, yet it also only sold by long-term contract. All this while people were dying.
It's easy to see how a manufacturer isn't going to add capacity for a puny short-term shortage.
I don't know, but maybe one of the factors is that, given how cheap microcontrollers have become, it's not uncommon to use an "overpowered" integrated chip just for ease of development. Suppose you have to drive some LEDs on a washing machine: do you bother developing optimized bespoke circuitry with discrete components, or do you just slap in a ~$2 100+MHz 32-bit Cortex controller that lets you implement all the logic in C and just reflash if you find an issue?
It makes me wonder if it would be possible to build a chip-manufacturing plant for any reasonable amount of money to produce these chips that don't need to be 7nm GPU powerhouses, but like the old clunker chips that can't get attention from the big guys.
Almost like starting a "generics" business in pharma medication but for older chipsets.
I'm sure there's a great trade to be had in producing the lower end stuff.
> It makes me wonder if it would be possible to build a chip-manufacturing plant for any reasonable amount of money to produce these chips that don't need to be 7nm GPU powerhouses, but like the old clunker chips that can't get attention from the big guys.
> Almost like starting a "generics" business in pharma medication but for older chipsets.
There is actually great interest in this business, but mainly from China. The world's biggest 200mm fab is in Shanghai. A decision to build a brand-new 200mm fab would never have flown in the West.
Chinese 3rd-, 4th-, and nth-tier fabs have been vacuuming up the market for old equipment for the last 5 years.
> I'm sure there's a great trade to be had in producing the lower end stuff.
At this very moment, production on 150mm-200mm wafers is actually a few times more profitable than on the latest process, because everybody is now ready to pay absolutely ridiculous premiums.
Most of the chips in these shortages are being produced on either older process nodes, or on slightly specialized nodes. The typical micro that's been hit by this is using anywhere from a 28nm to 180nm node.
The trouble is, this is a temporary shortage, so it makes no sense to spend serious cash (you're talking hundreds of millions) to make a new fab when the demand won't be there in a year or two.
I wonder if there's a good business in the mix of these ideas. If a lot of manufacturers really are using overpowered chips because they are (a) more available and (b) easier to program with newer tooling, then one might find a niche making cheaper/simpler/older-style chips, provided you also ship modern tooling that makes them easy to program for simple tasks like weighing things, blinking lights, playing little tunes, or reading a sensor. I've heard good things about PlatformIO, so leveraging that ecosystem could be a win as far as avoiding building your own IDE. Producing great documentation for the products would also go a long way toward gaining adoption.
Tons of chips are still made at >130nm and on 200mm equipment, for the simple reason that the companies involved don't make much money, or don't have much volume, in this stuff.
> The trouble is, this is a temporary shortage, so it makes no sense to spend serious cash (you're talking hundreds of millions) to make a new fab when the demand won't be there in a year or two.
While true, one could say it’s a bet on inflation to borrow dollars now for productive assets.
The problem with your idea is that you'd be competing against obsolete high-end fabs, which paid back all their capital costs long ago. In a normal market it's pretty much impossible to match them on price if you still need to pay off yours.
Still, GloFo basically made this their plan, when they pivoted from the very highest-end chipmaking into FD-SOI, which is less performant but cheaper to design for.
130nm is plenty ancient; it's the same feature size as a >10-year-old STM32F1, I think. And I hear that those MPW runs are starting to accept ~$10K for a guaranteed spot with a closed-source design.
So you'd probably be looking at charging 6 figures per wafer. I don't have good insight into startup costs, but I would guess high-8 to low-10 figures. Running costs would not be negligible either.
Is that possible? I haven't crunched the numbers and I don't have enough information or context to do so accurately. But my gut says that it might depend on how many billionaires you're on good terms with.
> Is that possible? I haven't crunched the numbers and I don't have enough information or context to do so accurately. But my gut says that it might depend on how many billionaires you're on good terms with.
130nm is quite ancient, but there are digital parts from the early nineties still on the market. They are way bigger than 130nm.
Right now I have an ongoing project with a company making aircons. Their kit supplier uses a really, really ancient and rare Hitachi MCU made on 600nm, and they are paying a few dollars for it, more than some modern ARM SoCs cost.
They really want to change their kit supplier, or compel the chip supplier to cut the cost, but the kit supplier itself can't migrate off the Hitachi MCU because they don't have the firmware sources: they have only copied the firmware around as a binary for decades.
> but the kit supplier itself can't migrate from Hitachi MCU because they don't have firmware sources as they themselves only copypasted the firmware as a binary for decades..
That seems like a rather existential problem. If I'm understanding correctly, the kit supplier makes the control board and the manufacturer does final assembly?
Yes, and the Chinese kit supplier seemingly got the tech from a Japanese aircon maker sometime in the nineties, and has copied the board verbatim ever since.
This is a weird comment. One of the key features of the SuperH ISA is that it's more or less backwards compatible. I worked on them in the 00s/10s, but I can't imagine they had an entirely different ISA in the 90s. I also know that commercial SH3 emulators exist because I've used them. Heck, Renesas used to ship one with the toolchain.
I'm surprised to hear this; maybe trying to salvage the old binary might have made some sense, but the client has already gone for a complete re-engineering, recognising the low availability of this rare chip as a serious threat.
I think that's exactly what some of the old fabs are doing.
When a new process node comes out not all fabs are immediately upgraded. Fabs with older tech simply start producing simpler chips while the new ones pump out cutting edge ones.
With the amount of horrible infotainment systems in the wild, I honestly doubt they're using overpowered chips. I'm sure any consumer-grade APU (i.e. a CPU with an iGPU) from the past 5 years would do better than the chips currently in cars.
I've worked in that industry. The problem with infotainment systems (be it in planes or in cars) is that they're usually designed years before the planes/cars enter production, and they have very strong constraints on price and component choice (you need automotive-certified parts, not smartphone parts, and they need to last a long time, even through Arizona summers), so they're already outdated by the time the car comes out.
These systems are also usually integrated with other systems to provide additional functionality using largely custom code that somewhat prevents quick iteration and code reuse, especially since the people writing the code are largely not in-house but various contractors (that's where a company like Tesla has the upper hand since I suppose that they control the software stack a lot more than the average).
Beyond that these systems suffer heavily from design-by-committee and worse yet, committees whose core competence really isn't computer UI.
I think the panels and digitizers used in automotive applications are pretty specialized and relatively expensive. They have environmental requirements that far surpass those of typical consumer products.
Eh, no, that's not how it works in high-volume manufacturing. There are 70 million washing machines sold per year. Suppose your large conglomerate employer sells 0.7% of that total, or 700,000 units. It doesn't take much per-unit savings to pay the salary of an FTE to optimize the design.
Maybe; you have to see whether the cost of having independent components (dev time, prototyping, etc.) is worth the few cents saved on the BoM.
Then you have to consider that IC designs are usually easier to reuse, since they're more flexible: you can have a single design with different firmware for your entire line of products versus custom hardware for every design. Even if you sell 700k units/year, you probably have several models in your lineup, each selling a fraction of that.
Beyond that it's pretty common for modern appliances to come with so-called "smart" features that require more processing and more IO capabilities. It's not rare for modern coffee makers to come with a color screen instead of the good old 7 segment displays.
So really the equation is not that simple, especially for higher end models that will have a more expensive BoM overall and a lower number of units sold.
It doesn't matter for this, but it's definitely the case in the hobbyist segment. Look at how many people use Raspberry Pis for things better suited to a microcontroller.
It's true that RPi are often overpowered but I'd contend that Linux is the platform being targeted more than the RPi itself. Development is much easier if you can assume a full fledged OS is running.
And even then, process nodes aren't fungible. Taping out a design for a totally new (to you) node is probably at least a year of time. And for what? Will the chip shortage be over then anyway?
This is correct. The industry is currently constrained on everything from water to wafers, in addition to fab time slots. Everyone is panic buying too, so shortages are getting amplified.
... and while there is lots of "real" demand, we also get the cryptocrazies exerting additional pressure, not just on the finest and best silicon available but now even on HDDs/SSDs. Prices have already risen 50+% in the last few weeks.
Not who you responded to, but: newly released consoles use tons of chips; staying home meant more people buying gaming PCs; everyone who started working from home needed tons of new hardware while their desktops at work sat unused; and money flowed from governments like water, so everyone had cash to buy all these things at once.

Then on the supply side, basically everyone stopped working for at least a couple of months (some longer), not only because of restrictions on the ability to work but also because of restrictions that make a lot of processes much less efficient, plus fear of going to work on top of that. Then many people were unwilling to return to work, because unemployment benefits have lasted over a year (rather than the normal few months) and pay more than a typical minimum-wage job even with the current extra $300 per week, which was an extra $600 per week for a long time. That also means people who used to spend their time on higher-wage, more productive jobs end up doing things they would normally delegate, because no one wants to work those jobs.

Chips are just one of many industries with shortages for similar reasons; theirs is just the worst of all, mainly from all the work-from-home demand. Last year my computer's power supply died, and there wasn't one available within 100 miles. I drove to every store within about 20 miles trying to get back online the same day. Even Newegg and Amazon were sold out online, so I had to spend about triple the normal price on a power supply far beefier than I needed, plus extra to have it shipped quickly. Not the same as chips, but a similar kind of need, and far fewer people need power supplies than need chips.
It is the new Bitcoin for Chinese businessmen: they buy up all the stock and stockpile it. If you go on Chinese sites, you can buy any chips you want, even thousands of them. Of course, you'll pay 10x the price and have a high chance of getting a counterfeit product.