They just moved a computationally intensive task to a much faster language. Not as large-scale as Twitter moving everything from Rails to Scala, for instance, but it's pretty standard in any large Rails application to use faster languages for performance-sensitive tasks.
Yeah, but some languages are designed with performance in mind and others are not. So while people have made faster implementations of JS, Python, Ruby, etc. over time, those languages are at a disadvantage compared to languages like C/C++, Fortran, and Rust. You can't really compete with a good compiler for those languages, which is why WebAssembly, Julia, and Crystal exist.
I can't seem to find references for it right now, but statically typed languages have severe advantages over dynamically typed ones in terms of performance, and strongly typed languages over weakly typed ones in terms of safety. The stronger the type system in a language, the greater the array of type-level optimizations the compiler can do.
There is simply no way for a compiler to know enough to optimize a Python script compared to, say, Rust.
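A minimal illustration of that point in Python (my own sketch, not from any particular codebase): a single function compiles to one generic "add" operation, because until runtime there's no way to know whether `+` means integer addition, string concatenation, or some user-defined `__add__`. A Rust compiler, by contrast, knows the concrete type at every call site and can emit a single machine instruction.

```python
def add(a, b):
    # CPython compiles this to one generic binary-add opcode. It cannot
    # specialize ahead of time, because the very same function object
    # must handle every type that defines __add__:
    return a + b

print(add(1, 2))        # integer addition -> 3
print(add("a", "b"))    # string concatenation -> "ab"
print(add([1], [2]))    # list concatenation -> [1, 2]
```

This is exactly the gap that JITs like PyPy try to close by observing types at runtime, but they can only specialize after the fact and must keep guards in place in case a new type shows up.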
Language depth has a cost in terms of raw productivity, though; that is why languages like MATLAB exist purely for prototyping, and nobody in their right mind would use them for production purposes.
There are languages which don't permit fast implementations. You might consider a "fast language" to be one for which it's possible to write a fast implementation.
Now you have me thinking about whether it would be possible to design a non-fast language, that is, a language resistant even to a tracing JIT. Programs in this language would have to defy the usual pattern of having hot loops that can be compiled once and run repeatedly. That seems to imply that operators must change behavior as they're repeated. In the typical machine model that's only suitable for a toy language, but it might make sense for a language that somehow compiles to neural-net logic...
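A toy sketch of what "operators change behavior as they're repeated" could look like (entirely hypothetical, just to make the idea concrete): an operator that rotates through different semantics on every invocation. A tracing JIT that compiled a loop body assuming "this op is addition" would invalidate its trace on the very next iteration.

```python
import itertools

class ShiftingOp:
    """Toy operator whose meaning rotates on every call, so no single
    compiled specialization of a loop body stays valid for long.
    (A hypothetical sketch of a 'JIT-resistant' primitive, not any
    real language feature.)"""

    def __init__(self):
        self._modes = itertools.cycle([
            lambda a, b: a + b,   # call 1: addition
            lambda a, b: a - b,   # call 2: subtraction
            lambda a, b: a * b,   # call 3: multiplication
        ])

    def __call__(self, a, b):
        # Each invocation advances to the next semantics.
        return next(self._modes)(a, b)

op = ShiftingOp()
print([op(6, 3) for _ in range(3)])  # [9, 3, 18]
```

Of course, a cycle this regular could itself be traced and unrolled; a genuinely resistant design would need the behavior shift to be unpredictable to the compiler, which is where something like compiling to neural-net logic gets interesting.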