This. I hire financial analysts regularly, and it's similar in that the job requires a high degree of proficiency in Microsoft Excel. Consistently, over many years of hiring, people absolutely do lie about their level of proficiency. Sometimes it's overconfidence or ignorance, but often it's just a straight lie. So we do Excel modeling tests similar to this. It's truly the only way to get a good gauge of where an individual's proficiency is and how they approach modeling problems. I don't see a way of eliminating these tests, but to OP's point, I do try to make them generic (no industry/business knowledge required), open ended (allowing candidates to solve the problem however they want), and generally "good". We always tell folks taking the test, "it's not meant to be trick questions; if an instruction is confusing, ask us for clarification," and afterwards we discuss "how do you think you performed?", "what did you think?" and those kinds of things, to iterate on any poorly constructed aspects we may be blind to.
At least with Financial Analysts they actually use Excel daily (or often).
For me, I have over 20 years of experience as a lead architect, designer, and engineer of both apps and databases at a wide range of companies and industries. But it feels like that experience is out of touch with how most companies hire in the current market.
Whenever I get invited to FAANG interviews, the recruiters tell me it's best to cram for months on material I would never or rarely use in real life and learned in theory over 20 years ago. I did one interview after prepping an hour or two per day for two months; I passed with most interviewers but missed with one, so they told me to try again in 6 months. Non-FAANG tech companies outside Silicon Valley seem just as bad, judging from what recruiters have been sending me.
Although some of those FAANG companies pay so well (particularly Meta) that it can make it worth it. That said, having wasted so much time just for the option to interview again every 6 months makes me wish I could work in some other industry that pays as well or better.
The current interview process makes me lose interest in the profession, but not in the actual development of apps and sites.
Is there a video demonstrating the kind of modeling you're talking about? The only thing I've seen that sounds similar was a Martin Shkreli video which is of course removed from YouTube now, and I don't know whether Shkreli was a beginner or an expert, just that he navigated Excel 20 times faster than I do.
There are some videos of high-level skills if you search "excel modeling championship". Shkreli was a hedge fund/Wall Street guy, if I remember correctly, and probably has more of a banking background. That area of finance tends to be hyper-competitive on Excel skills. I've heard stories of firms banning mice from the office to force people to use keyboard shortcuts and whatnot. It's a locker-room, braggadocio badge of honor in that world. But that's the type of application navigation it sounds like you're describing.
I, on the other hand, am in the realm of corporate finance. Think CFOs, Controllers, and their teams. It's a bit more relaxed here, but you still need enough skill to get the job done effectively. We don't really care how you get the job done (use a mouse, keyboard, Google; it's all fair game), but you need a clean output, and your approach in an interview context should convince me that you'd be capable of solving similar problems in the future. For that reason, I typically provide crappy raw data (oddly and inconsistently formatted) and ask them to build a model from it or analyze it. Given some fictional general ledger data, can they build a 3-statement model? Can they summarize some payroll data in a logical way? Build a budget scenario based on the following actual financial data... also, here are 5 basic features the model should incorporate. I don't typically watch them, so I don't know how they navigate the application (I don't find it important), but we give a time limit, stop them at time, and review the work attempted. The highest-skilled folks tend to finish within the time limit.
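To make the "crappy raw data" idea concrete, here's a toy sketch of the kind of cleanup-then-summarize task I mean, done in Python rather than Excel (the data, column names, and values are all invented for illustration): inconsistent casing, stray whitespace, and mixed currency formatting that have to be normalized before you can total anything by department.

```python
import csv
import io
from collections import defaultdict

# Hypothetical "crappy" raw payroll data: inconsistent casing, stray
# whitespace, and mixed currency formatting.
raw = """department,employee,gross_pay
 Sales ,Alice,"$4,200.00"
sales,Bob,3950
ENGINEERING,Carol,"6,100.50"
Engineering ,Dan,$5800
"""

def parse_pay(value: str) -> float:
    # Strip currency symbols, thousands separators, and whitespace.
    return float(value.replace("$", "").replace(",", "").strip())

totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    # Normalize " Sales ", "sales", "ENGINEERING" to one canonical key.
    dept = row["department"].strip().title()
    totals[dept] += parse_pay(row["gross_pay"])

for dept, total in sorted(totals.items()):
    print(f"{dept}: {total:,.2f}")
```

The Excel version of the same exercise would be TRIM/PROPER-style cleanup plus a pivot or SUMIF; the point of the test is the same either way: can the candidate get from inconsistent input to a defensible summary.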
It's counter-intuitive, but the worst thing for an analyst is to work at a company that has great ERP/BI tools, reliable data, etc. Only about 10% of companies have their act together in that regard, and those analysts just run reports from some GUI and never have to work with raw data. So they have a low level of Excel skill but think they are great analysts. I've never actually worked in that world, but I know those analysts wouldn't cut it in the real world of crappy data and ad-hoc work. I've seen them crash and burn too many times, and it's painful for the entire team.
Back to your original point: as you might expect, most folks who are great at Excel simply use it a lot to build complex things. They may have learned the keyboard shortcuts along the way, or not. It just takes practice; muscle memory will kick in, but it also takes regular use to stay sharp. There's also some pretty decent tooling to help you step your game up if you wish. I personally like https://macabacus.com/