
The rising cost of software in the design process is startling. I wonder what opportunity there is for new entrants to reduce that cost.


Jeff Dean's recent talk on ML for hardware design seems like a great application of the tech in a space where we are seeing design process costs balloon (see https://www.youtube.com/watch?v=FraDFZ2t__A).


I was actually wondering if someone could explain where that cost increase is coming from. I know the design rules get more complicated as the process node shrinks, but I thought most of those design rules were essentially "taken care of", because customers use building blocks from the foundry that already have those rules baked in? And I thought they're still using the same software?


A lot of it is that to get continued gains, you run out of easy stuff to optimize. When Moore's law was alive and well, the job of chip designers was to build abstractions that let them cheaply scale down their designs without introducing too much overhead. Now, if you want to announce a 30% gen-on-gen improvement, you can only count on the fab to give you half of that (and even that has gotten harder: co-optimization is now needed, but it is really hard). For the other half, you now need to hunt down every last inefficiency that you previously accepted to make your life easier. Pure digital signals go to PAM4. Layout becomes less regular. You start trying to optimize the whole chip rather than just combining optimized pieces. Then in 3 years, you have to find another 15% and the process repeats, but this time you have used up all the low-hanging improvements.
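A rough back-of-the-envelope sketch of that split, using the comment's illustrative numbers (30% gen-on-gen target, fab contributing roughly half) plus an assumed smaller fab contribution for the follow-on generation; the figures are placeholders, not real process data:

  def design_side_share(target_gain: float, process_gain: float) -> float:
      """Speedup the design team must find so that
      (1 + process_gain) * (1 + design_gain) hits the overall target."""
      return (1.0 + target_gain) / (1.0 + process_gain) - 1.0

  # Generation N: 30% target, fab gives ~15%
  print(design_side_share(0.30, 0.15))   # ~0.13 -> ~13% must come from design work

  # Generation N+1: 15% target, fab contribution assumed to shrink to ~5%
  print(design_side_share(0.15, 0.05))   # ~0.095 -> ~9.5%, with the easy wins already spent

The point is that the design-side share doesn't shrink nearly as fast as the headline target, which is where the extra engineering (and tool) cost piles up.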



