Having worked in EDA in the past it's kind of interesting that inside of the big 3 EDA companies (Cadence, Synopsys and Mentor Graphics) the culture seems to be very averse to change. For example, where I was working we had a C++ codebase that was mostly from the 90s. C++ templates and STL were verboten, they had their own non-templated containers from the 90s (this was in the 2006-2012 timeframe). We were using a version of gcc that was always about 7 years old. Similar for the ancient versions of RedHat we were running. Given how important EDA is to chip design you'd think there'd be more openness to innovation. The other thing that was frustrating was how siloed things were. There were software components that I'm sure other groups within the company had developed, but we were never encouraged to reach out to find out what might already be available - in fact it seemed to be discouraged. There wasn't some kind of central code repo within the company where we might look for such things - each group had their own repos that weren't accessible from outside the group.

The culture had a sort of "old boys club" feel. A lot of the folks working there had been there for 15 to 20 years or more - maybe this was what contributed to that, I'm not sure.

At any rate, EDA is a very interesting area to work in if you're a software engineer because there's a lot of advanced algorithmic work to do that you don't get much of in other industries. But the way they approached that work was just not very interesting (old tools, old methodologies, etc.). I suspect things are different in EDA startups, but unfortunately, they tend to be perpetually starved for cash - it's not an area that most VCs are familiar with, and the time to payoff is much longer than for web startups. What tends to happen is that a startup EDA company will develop a product that falls outside of (but complements) what's currently offered by big EDA, and then try to get the attention of the Big 3 and get bought by one of them - that's the measure of success, not going public as it would be in other industries, because it's been extremely rare for EDA startups to go public since the Big 3 became established.



> Given how important EDA is to chip design you'd think there'd be more openness to innovation.

Hypothesis: the importance of EDA to chip design makes it change-averse.

I once had to chase down a delta between two triangle mesh files that resulted from two different builds of the same compiler, running on two slightly different hardware configurations, compiling the same source into different floating-point code that calculated a result of -0 instead of 0, which resulted in a no-difference rotation on the indices of every vertex on every triangle in the entire file.
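For anyone who hasn't hit this before, here's a minimal C++ sketch of the general mechanism (not the actual mesh code, just an illustration): -0.0 compares equal to 0.0, so most code never notices the difference, but the sign still leaks out through functions like atan2 and can flip a downstream decision.

    #include <cmath>
    #include <cstdio>

    int main() {
        double pos = 0.0, neg = -0.0;

        // The two zeros compare equal, so equality checks never see a difference...
        std::printf("equal: %d\n", pos == neg);   // prints 1

        // ...but they are not the same bit pattern, and some math exposes the sign.
        std::printf("signbit: %d %d\n", std::signbit(pos), std::signbit(neg));          // 0 1
        std::printf("atan2: %+f %+f\n", std::atan2(pos, -1.0), std::atan2(neg, -1.0));  // +pi vs -pi
        // A +pi/-pi flip like that is exactly the kind of thing that can rotate
        // downstream indexing while every individual comparison still reports "equal".
    }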

Given that it's basically impossible to mathematically validate C++ code, if any change could cause a difference in output in a million-transistor layout, how would an engineer confirm that difference didn't matter? That kind of problem drives the cost way up for changes to the codebase.


Physicist here, been with the same company for almost 25 years, developing measurement equipment. We have a lot of older scientists and engineers. My impression is that there's no correlation between age and resistance to innovation. Some of the oldest engineers are the ones pushing the hardest for trying new things.

Other possible explanations for the longevity of workers include: 1) It's rewarding, interesting work. If it's not directly improving people's lives, at least it's improving something. 2) It's based on principles that are hard to learn and don't change very rapidly, such as math and physics. The engineers who are given the hardest math and theory problems to solve tend to have the grayest hair. 3) The people who know the value of their domain knowledge and experience are also capable of controlling the pace of work so they don't burn out.


Remember, the very best machine learning models are distilled from postdoc tears.


Rest assured that one of the companies you mentioned deployed a catalog of reusable software components not long ago. It's already quite populated. But the funny thing is, you need VP approval to submit your modules.

I agree that EDA companies are unnecessarily siloed and dated in many places.

Were it not for the moat of hundreds of PhDs working on the algorithms, it'd have been disrupted to hell by startups.


EDA software provides the infrastructure for projects with very high NRE and with low tolerance for failure (you can't just recompile and deploy a new chip design for free when you've already built a million bricks). Their entire business model is built around risk avoidance, so it would require an enormous improvement in efficiency for a client to take a risk on a "disrupting" startup's tools.


I used to work in EDA/Semi-IP as well. In addition to the common argument that tool changes are avoided to mitigate risk because of expensive iteration costs, another rationale is that the design and manufacturing costs of making a chip are so high that integrators are less fixated on switching tools for savings, because those savings are not significant relative to the overall budget.


Yes, BUT... Each new technology node (excluding shrinks) already disrupts the flow in many breaking ways. Seemingly small changes, new DRC requirements, and the need to model more side effects all conspire to make porting to a new node anything but easy.


And you can't just upgrade the stack every 2 weeks during a 2-year project. There is an infamous example of this in the mechanical CAD world with Airbus and CATIA, where they had to rebuild half a plane in a different version of the CAD software.


I actually looked into these.

There's a tradeoff between software skills and algorithmic skills. These guys are really, really serious about algorithms, so it figures their software practices aren't as good: none of that recent fancy shit, git at best. And it has to be PERFECT within its context, no fucking up or patching the fuckups with an if-else for that bug.


> Rest assured that one of the companies you mentioned deployed a catalog of reusable software components not long ago. It's already quite populated. But the funny thing is, you need VP approval to submit your modules.

LoL this seems par for the course.


And the fact that billions of investment in making chips that work is at stake.

That's a really, really, REALLY wide moat.


I've used Cadence OrCAD PSpice for mixed A/D simulations, and not only does it feel really dated, it's relatively slow and it's also pretty buggy: a lot of mid-simulation crashes telling you the program has failed and needs to be closed; project corruption that forces you to create a new project and add the files back manually; analogue circuits that never converge, like oscillators that don't oscillate even though they do in other simulators and in the real circuit.

LTspice looks dated too, but at least its developer treats simulation speed and convergence as really important matters (especially compared to OrCAD PSpice): it can use multiple cores, it ships 64-bit binaries, and it uses a lot of tricks to get the maximum performance out of every machine it runs on.
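For the curious, one family of convergence tricks simulators lean on is limiting how far the Newton-Raphson solve can move an exponential device per iteration. Here's a toy C++ sketch of that idea on a single resistor-plus-diode node; it's only an illustration of the technique, not how LTspice or PSpice actually implement it, and the component values are made up.

    #include <cmath>
    #include <cstdio>

    int main() {
        // Toy circuit: 5 V source, 1 kOhm resistor, diode to ground.
        // Solve KCL at the diode node: (Vs - v)/R = Is*(exp(v/Vt) - 1).
        const double Vs = 5.0, R = 1e3, Is = 1e-14, Vt = 0.02585;

        double v = 0.0;  // initial guess
        for (int it = 0; it < 100; ++it) {
            double f  = (Vs - v) / R - Is * (std::exp(v / Vt) - 1.0);  // residual current
            double df = -1.0 / R - (Is / Vt) * std::exp(v / Vt);       // its derivative
            double dv = -f / df;                                       // raw Newton step

            // The "trick": clamp the per-iteration voltage step. Without this,
            // the first step overshoots to ~5 V and exp(v/Vt) blows up.
            const double max_step = 0.1;
            if (dv >  max_step) dv =  max_step;
            if (dv < -max_step) dv = -max_step;

            v += dv;
            if (std::fabs(dv) < 1e-9) {
                std::printf("converged after %d iterations: v = %.6f V\n", it + 1, v);
                return 0;
            }
        }
        std::printf("did not converge\n");
    }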

I don't know what Cadence Virtuoso and the other more specialized products are like, but I can't imagine paying what this software costs and still having those problems. I'm not saying they're bad or that they aren't improving, but I feel these companies serve such a specific niche that they simply don't care.


The risk-averse nature of the semiconductor industry comes from its financial scale and speed. If there's a mistake, re-spinning the chip (even just the metal layer) is costly in both time and money. No self-respecting semiconductor company wants to be the guinea pig for an entirely new tool.

That being said, better tools for design and verification would still win out, since developer time is more valuable/scarce than computational time. However, hardware tends to differ from software in that the computational requirements for designing and verifying a chip are far greater than for the average product in the software industry.

Literally everyone I know in my industry hates the current state of EDA tools and feels like they're stuck in the past, yet most of them defend the use of Perforce over git, cron over Jenkins, XML over JSON, TCL over Python, Cisco AnyConnect over Tailscale, and running their remote terminals in a VNC session instead of using SSH like normal people. I feel like the pot is calling the kettle black here.


The change from proprietary scripting like dc-shell script to TCL a quarter century ago was a rough one; let's not be too rushed with the next big change.

On the other hand, users of software compilers are only now starting to talk about link-time optimisation and guided optimisation, techniques over which I had much more control in dc-shell than in any modern software toolchain.


The 3D CAD industry is very similar. People with 20-40 years of tenure. Old C, Lisp, etc. codebases.

Has it occurred to you that maybe their field is just so much more complicated than the run-of-the-mill web crap we do that they just can't be bothered to care about which language their tooling is written in, or which line the braces go on?


The deeper you go into the tech stack, the slower things move.

Frontend frameworks move faster than backend frameworks, which move faster than programming languages, which move faster than operating systems, which move faster than processor architectures, which move faster than the technology used to create them.


Linux moves faster than many web frameworks imo.


But not the interface exposed to the developer, or its coding standards, or best practices.


This is an interesting comment but I’m not sure it stands up to closer examination. For example, ISAs change very slowly - more so than aspects of fabrication.

I’d argue that the parts of the stack that connect with other levels move more slowly than those that don’t. They need to be stable so that the whole stack continues to work together.


I've worked for one of the big three you mentioned. Probably a different one, or at least a different group, since the culture seemed fine to me.

I can't think of a single time when old code or tools were a barrier to innovation. We updated when there was a real need, but otherwise left working code alone to focus on adding innovative engineering features. The code was so heterogeneous that it was modular by default, so the pain of one library's tooling was limited to that one library.


I dabbled a bit in FPGAs a few years ago. My conclusion is that the whole digital circuit design industry is ripe for disruption by open source, and I'm rather surprised it hasn't happened yet.

There might be money to be made founding a startup that starts the revolution, but I don't know enough about the industry to figure out a business plan any more detailed than that.


> My conclusion is that the whole digital circuit design industry is ripe for disruption by open source...

I've seen this claim many times before, but I never see any proposals about exactly how that would happen.

All open source FPGA tools that exist today are good enough for fun hobby projects (I use them every once in a while), but none of them have the additional features that are needed in the real world.

When pressed about it, people point to 50GB downloads, bad GUIs, and bugs in the tooling. Not that these aren't issues, but if that's all you can point to, then I don't see where the disruption will come from. To have that many bugs and a 50GB download in the first place, you need tons of code, and the functionality that code implements.

Right now, that's not there, and it very likely never will be. It's such a niche domain.


One notable exception is Verilator, which is growing fast and competes well with commercial Verilog simulators (https://github.com/verilator/verilator). It has industry support, notably from Western Digital.
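For a flavour of the workflow: Verilator compiles your Verilog into a C++ model that you drive from a small harness. A minimal sketch along the lines of the upstream examples is below; the class name Vtop and the clk port are assumptions that depend on your top-level module.

    #include <memory>
    #include "Vtop.h"        // model generated by `verilator --cc top.v`
    #include "verilated.h"

    int main(int argc, char** argv) {
        auto contextp = std::make_unique<VerilatedContext>();
        contextp->commandArgs(argc, argv);
        auto top = std::make_unique<Vtop>(contextp.get());

        top->clk = 0;  // assumes the design exposes a `clk` input
        while (!contextp->gotFinish() && contextp->time() < 1000) {
            contextp->timeInc(1);     // advance simulated time by one unit
            top->clk = !top->clk;     // toggle the clock
            top->eval();              // evaluate the compiled model
        }
        top->final();                 // run final blocks / cleanup
        return 0;
    }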


> My conclusion is that the whole digital circuit design industry is ripe for disruption by open source, and I'm rather surprised it hasn't happened yet.

Just build a modern design rule checking (DRC) engine with a parasitic extractor. That's all you have to do. I eagerly await your release on GitHub.
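To make concrete how deep that rabbit hole goes: even the most basic DRC rule, minimum spacing on one layer, already involves large-scale computational geometry. Below is a toy C++ sketch of the naive version (made-up shapes and a made-up 45 nm rule); a real engine replaces the O(n^2) loop with scanline or corner-stitching structures, handles billions of shapes hierarchically, and supports hundreds of rule types beyond spacing.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Toy axis-aligned rectangle on a single layer, coordinates in nanometres.
    struct Rect { long x0, y0, x1, y1; };

    // Edge-to-edge separation between two rectangles; 0 if they touch or overlap.
    double spacing(const Rect& a, const Rect& b) {
        long dx = std::max({a.x0 - b.x1, b.x0 - a.x1, 0L});
        long dy = std::max({a.y0 - b.y1, b.y0 - a.y1, 0L});
        return std::hypot(static_cast<double>(dx), static_cast<double>(dy));
    }

    // Naive O(n^2) minimum-spacing check over all shape pairs on a layer.
    std::vector<std::pair<std::size_t, std::size_t>>
    check_min_spacing(const std::vector<Rect>& shapes, double min_space) {
        std::vector<std::pair<std::size_t, std::size_t>> violations;
        for (std::size_t i = 0; i < shapes.size(); ++i)
            for (std::size_t j = i + 1; j < shapes.size(); ++j)
                if (spacing(shapes[i], shapes[j]) < min_space)
                    violations.push_back({i, j});
        return violations;
    }

    int main() {
        std::vector<Rect> metal1 = {{0, 0, 100, 50}, {130, 0, 230, 50}};  // 30 nm apart
        for (auto [i, j] : check_min_spacing(metal1, 45.0))               // hypothetical 45 nm rule
            std::printf("min-spacing violation between shape %zu and shape %zu\n", i, j);
    }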

I agree with you that open source really is the only valid path forward. However, you have to convince some very smart people to join your cause and they expect to be paid up front.


Yep, the domain is too narrow to develop free or open source tools like that. Someone needs to work full time to gain that knowledge, and thus needs to make a living at it. The few open source tools come from universities, where labor is cheap, and they suffer from bit rot. I'm specifically thinking of MAGIC.


> inside of the big 3 EDA companies (Cadence, Synopsys and Mentor Graphics)

I never used their products, but having consulted inside Mentor Graphics I can tell you they were positively toxic. I couldn't get out of there fast enough; it was pretty clear there was no way we were going to be able to clean up that mess, and they didn't like to listen or to change.



