US tech, and Western tech in general, is very culturally homogeneous - and by "culturally" I mean the kind of coding people have done.
The DeepSeek papers published over the last two weeks are the biggest thing to happen in AI since GPT-3 came out. But unless you understand distributed file systems, networking, low-level linear algebra, and half a dozen other fields at least tangentially, you wouldn't have realized they were important at all.
Meanwhile, I'm going through the interview process for a tier-1 US AI lab, and I'm having to take a test about circles and squares, then write a compsci 101 red/black tree search while talking to an AI - while being told not to use AI. This is with an internal referral who is keen for me to be on board. At this point I'm honestly wondering if they aren't just using the interview process to generate high quality validation data for free.
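For context, the "search" part of a red/black tree question is just ordinary binary search tree lookup - the node colors only matter for insertion and rebalancing. A minimal sketch in Python (hypothetical names, not any lab's actual interview question) of roughly what's being asked for:

    # Red/black node: the color field only matters for insert/rebalance,
    # so plain BST lookup is all a "search" question actually needs.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        key: int
        color: str                        # "red" or "black"; unused below
        left: Optional["Node"] = None
        right: Optional["Node"] = None

    def search(root: Optional[Node], key: int) -> Optional[Node]:
        # Standard BST descent: smaller keys go left, larger go right.
        node = root
        while node is not None and node.key != key:
            node = node.left if key < node.key else node.right
        return node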
Competition can only work when there is variation between the entities competing.
In the US right now you could have a death match between every AI lab, then give all the resources to the one that wins, and you'd still end up with largely the same results as if you hadn't.
The reason DeepSeek - which started life as an HFT firm - hit as hard as it did is that it was a cross-disciplinary team with very non-standard skill sets.
I've had to try and head hunt network and FPGA engineers away from HFT firms and it was basically impossible. They already make big tech (or higher) salaries without the big tech bullshit - which none of them would ever pass.
> I've had to try and head hunt network and FPGA engineers away from HFT firms and it was basically impossible. They already make big tech (or higher) salaries without the big tech bullshit - which none of them would ever pass.
Can confirm. There are downsides, and it can get incredibly stressful at times, but there are all sorts of big tech imposed hoops you don’t have to jump through.
> all sorts of big tech imposed hoops you don't have to jump through
Could you kindly share some examples for those of us without big tech experience? I assume you're talking about working practices more than just annoying hiring practices like LeetCode?
Engineers at AI labs just come from prestigious schools and don’t have technical depth. They are smart, but they simply aren’t qualified to do deep technical innovation.
What are you doing with FPGAs? I’m an FPGA engineer and don’t work at an HFT firm. HFT jobs seem to be in the minority compared to all the aerospace/defense and other-sector jobs.
> At this point I'm honestly wondering if they aren't just using the interview process to generate high quality validation data for free.
Not sure if that is accurate, but one of the reasons why DeepSeek R1 performs so well in certain areas is thought to be access to China's Gaokao (university entrance exam) data.
Fortunately, thanks to transformer models, I won't need to learn Chinese when our glorious leader Xi Jinping liberates us from the capitalist running dogs.