I often find it hard to believe that a single person can make much of a difference in a problem domain as intricate as chip design, but in his case the evidence is overwhelming. It also goes to show what a shit show Intel has become, since even he was not able to right that ship. I think the CPU space will be all ARM and RISC-V ten years from now, and since Intel never really managed to become a dominant player in any other (relevant) field, they are pretty much done for.
Part of it may also be the situation the company is in, and its mindset, when Keller is hired.
There's no arguing that Keller is a smart guy, but he doesn't design an entire CPU architecture himself. If you're desperate and say "Okay, we are hiring the smartest guy we can find to build our new CPU, and giving him everything he needs to make it happen", then perhaps you get AMD64, Zen, or the A4 and A5. If you just dump a smart guy into a team as just another engineer, maybe you get nothing, like Intel.
Perhaps AMD, who already knew him, just gave Keller everything he needed to build a team that could deliver on a new architecture, even after he's no longer there. Same with Apple. Intel, on the other hand, may have been unwilling to grant Keller the same level of autonomy and control. Then it also makes sense that he would leave Intel for "personal reasons", those being: "I can't work here, they won't let me do my job".
One of the tendencies of shrinking companies is exacerbated executive infighting.
If the company is growing, there are new X-of-Y positions to move up to.
If the company is stable or shrinking, people start watching out for their own careers with knives out.
AMD possibly avoided this because of size & realization of what needed to be done. Intel's too big & old: I would be very surprised if they weren't much more internally resistant to that sort of change.
And one can only deal with colleagues throwing up brick wall after brick wall over every bit of minutiae for so long, at least if you're talented enough to have other options.
This is an absolutely on-point observation about company dynamics that a large number of people in the tech industry have never had to experience. It's why growth is so critical.
Past a certain size threshold, the organizational and social dynamics of human relationships seem to be the predominant factor in getting anything done. I.e., there's a limit to how big of a headwind we can cope with.
It is hard to believe. However, from my own experience: execution, decision making, vision and goals, people, alignment, setbacks. A very complex mixture, and the more people there are, the more important a driving force becomes.
PS: Nice nickname. I read the book by Th. Mann, over a period of 7 years I think.
If you have someone in charge who knows how to run things and help people do their jobs better, while also having a vision of the future, you have a much better chance of success.
I learned an important lesson at my first job out of school: high-quality tech people are more common than people who can effectively lead them. It was a tough lesson for me because I had spent my entire academic career striving to become a top-notch engineer. Don't confuse a leader with a manager. They might be the same, they might not.
>Also goes to show what a shit show Intel has become since even he was not able to right that ship.
I think we are already starting to see the fruit of his work. Intel doesn't need Jim Keller for CPU uArch design. Intel has had their uArch roadmap ready, and they would have been the best in the industry if it weren't for the 10nm delay. They also have work in the pipeline, all of it held back by their process node.
Jim described it in one of his interviews (sorry, I spent 10 minutes but couldn't find the source, so I may be remembering it wrong): don't let the process node hold back your chip design, something he has experience with from AMD and Apple, and stay flexible enough to back-port your design should anything happen, as a Plan B. Previously, Intel would just keep waiting for the process guys to fix it. That in itself is a huge workflow change. It is hard to imagine the amount of work required to push this through, especially with all the internal politics at Intel.
And Intel is at least looking at TSMC / alternative paths for some of their product lineup now (gaming-focused, large-die-size GPUs). Whether that is decided or not is unclear. But at least we have Rocket Lake launching soon, which is sort of a half-baked Willow Cove (used in Tiger Lake) ported back to 14nm on desktop. And we have Sapphire Rapids, as well as other product roadmaps hinting at multiple nodes (shown in investor meeting notes). That at least shows Intel has changed their internal design process to be flexible enough in case of another 10nm-like fiasco. And I think Jim Keller deserves some credit for this transition.
Of course, having a flexible design still doesn't fix their problem if TSMC is 2 years ahead of Intel in leading-edge node design, volume, and cost. And as I have repeatedly stated, Intel's problem is not design, but their business model. It would not surprise me if TSMC shipped more 5nm wafers last year (2020) than Intel's entire 10nm production history since 2017.
Honest question: are we sure he didn't make a difference? The dude usually shows up, does the work with the team, and leaves... only a year, two, or more later do we see the results.
There's a very big delay between finalizing a design and actually etching said design in silicon, and bringing a design to silicon of course involves a lot of non-trivial work. This is kind of why Intel had the whole tick-tock thing: while the current design is being put into silicon, the design team can work on the next iteration of the design. It's also why AMD could be very confident about their next Zen iteration being a lot faster when they were releasing Zen 2.
I only know the name of one prominent individual within the CPU space who is not a CEO or major scientist, and that's Jim Keller.
Why? Because I never ever see any articles talking about any other interesting employees. Every single time it's Jim Keller.
I'm sure he's good; his interview with Lex Fridman shows that he's knowledgeable and creative. But there's no way he's as singular a force as the media portrays him to be.
Just going by the interview posted by another commenter, it seems to me a big reason is that he enjoys the "people challenge" about as much as the technical challenge.
One argument is perhaps that bad management can be stifling and it can be hard to achieve good outcomes under bad management. The semiconductor space is perhaps difficult because you have very long lead times and the cost of each iteration is high: if you have different parts of the organisation pulling in different directions, you're unlikely to have a good outcome, and iterating to unify that direction is very difficult.
Novel development in a complex problem space isn't something one can just throw manpower at and expect progress. I'm sure a certain amount of his fame is self-reinforcing, as name recognition always compounds. There's no way he's more influential than the rest of the industry combined (shoulders of giants and whatnot), but I would be hard pressed to find press recognition of other hardware engineers. Still, he is undoubtedly exceptional.
For example, one could round up as many scientists as they could find in 1900, but there is no number that would guarantee the progress made in theoretical physics by someone like Einstein alone.
Worse than that, I wonder if the trouble at Intel (e.g. the inability to develop post-14nm chips, plus one insane instruction set extension after another; I wonder if the point of AMX is to have a big die area that is mostly unused and doesn't need to be cooled) isn't something people like him are running from, but rather something they are going to bring with them wherever they wind up.
>one insane instruction set extension after another
You're probably going to see a whole lot more of this sort of thing given the limits of process scaling. Keeping things simple and backward compatible made sense when you could just throw more transistors at the problem. Now you're seeing more and more specialized circuitry that software people are just going to have to deal with.
I am not against a new instruction per se. At first blush the new JavaScript instruction in ARM might seem like a boondoggle, but it is a simple arithmetic operation.
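To show why it's "a simple arithmetic operation": ARMv8.3's FJCVTZS performs a JavaScript-style double-to-int32 conversion in one instruction. A rough sketch of those semantics in portable C, where `js_to_int32` is a hypothetical helper name and the simplification assumes the truncated value fits in an `int64_t`:

```c
#include <math.h>    // isnan, isinf, NAN
#include <stdint.h>

// Rough sketch of JS ToInt32 semantics, which ARMv8.3's FJCVTZS collapses
// into a single instruction: truncate toward zero, then wrap modulo 2^32.
// Simplified: assumes |d| < 2^63 so the int64_t cast is well-defined.
int32_t js_to_int32(double d) {
    if (isnan(d) || isinf(d)) return 0;    // JS maps NaN and +/-Inf to 0
    int64_t t = (int64_t)d;                // C casts truncate toward zero
    return (int32_t)(uint32_t)(uint64_t)t; // keep only the low 32 bits
}
```

Without the instruction, a JIT has to emit several instructions for every such conversion, which is why it pays off for JavaScript-heavy workloads.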
Compare that to the non-scalable SIMD instructions, which mean you have to rewrite your code to take advantage of each new vector width; as a result, people don't bother to use them at all.
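A minimal sketch of the non-scalability complaint, assuming an x86-64 target, using SSE2 intrinsics (baseline on x86-64). The 4-wide vector width is baked into the code, so moving to AVX or AVX-512 means rewriting the loop with a different intrinsic family:

```c
#include <emmintrin.h>  // SSE2 intrinsics (fixed 128-bit vectors)
#include <stddef.h>

// Sum a float array with SSE2. The width (4 floats per vector) is
// hard-coded: a wider ISA (AVX: 8 lanes, AVX-512: 16) requires a rewrite.
float sum_sse2(const float *a, size_t n) {
    __m128 acc = _mm_setzero_ps();
    size_t i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(a + i));
    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    float s = lanes[0] + lanes[1] + lanes[2] + lanes[3];
    for (; i < n; i++)      // scalar tail for leftover elements
        s += a[i];
    return s;
}
```

Contrast ARM's SVE, where the vector length is queried at runtime, so the same binary exploits whatever width the hardware provides.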
AMX allocates a huge die area to GEMM functionality that gets used a lot less in real numerics than you'd gather from reading a linear algebra textbook.
There are approaches to the problems the industry faces other than "fill up the die with registers that will never be used"; Nvidia and Apple are going that way, and that is why they are succeeding while Intel is failing.
As I understand it, Apple has a direct equivalent to Intel's AMX as an undocumented instruction set on their new Apple Silicon laptop processors; it just took a while for people to figure it out because the whole thing was hidden behind an acceleration library that is implemented very differently on Intel-based Macs.
> find it hard to believe that a single person can make much of a difference ... goes to show what a shit show Intel has become since even he was not able to right that ship.
These statements almost seem contradictory. What if instead of "not being able to right that ship", it is instead an example to the contrary?
Intel started out with memory. They never left, and are by far on the cutting edge of memory tech. In particular, Optane NVM DIMMs are so fast they basically define a new layer in the performance/cache hierarchy. Intel might see their focus shift over time away from CPUs toward chalcogenide-based persistent memory, where they seem to have held the lead for some time now.
Isn't 80+% of Intel's revenue from CPUs, with another few percent from mobile chips? So while they may produce memory, it's almost irrelevant as far as current revenue goes.