Conceptually, loops are a very special case of recursion, and they are mostly popular in languages that make some 'interesting' trade-offs in their implementation of function calls.
(Stacks are an implementation detail. Only some languages implement function calls with stacks.)
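Concretely, here's a throwaway Python sketch (names are mine) of the same computation written as a loop and as a tail call; a compiler that performs tail-call elimination can reuse one frame for the recursive version, which makes it exactly the loop:

```python
def sum_loop(n):
    """Sum 1..n with a while-loop."""
    total = 0
    while n > 0:
        total += n
        n -= 1
    return total

def sum_rec(n, total=0):
    """Sum 1..n with a tail call. Nothing happens after the recursive
    call, so a tail-call-eliminating compiler can reuse the current
    frame, turning this into the loop above. (CPython doesn't do this,
    so this version is stack-depth-limited there.)"""
    if n == 0:
        return total
    return sum_rec(n - 1, total + n)

print(sum_loop(10), sum_rec(10))  # 55 55
```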
Can you give some examples of languages that don't use a stack for function calls? I'd love to read the assembly generated by their respective compilers to see what that looks like at the CPU level.
Languages that compile to continuations often put the continuations on the heap. Think Standard ML and Scheme implementations (not all of them, of course); SML/NJ is a specific example.
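To illustrate the idea (a hypothetical Python sketch, not what SML/NJ actually emits): after CPS conversion plus defunctionalization, the "call stack" of factorial becomes an explicit linked list of frames living on the heap, and the driver is a flat loop:

```python
def fact_heap(n):
    """Factorial with its continuation reified as heap data:
    each pending multiplication is a (n, rest) pair, i.e. a cons
    cell, instead of a native stack frame."""
    k = None                 # the empty continuation ("halt")
    while n > 0:
        k = (n, k)           # a "call" pushes a frame onto the heap list
        n -= 1
    acc = 1                  # base case reached; now "return" through
    while k is not None:     # the heap-allocated continuation
        m, k = k
        acc *= m
    return acc

print(fact_heap(5))  # 120
```

The point is that the native stack depth stays constant no matter how deep the "recursion" goes; the frames are ordinary garbage-collected objects.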
That's one group of examples. Another group is languages that are not strict. Haskell is perhaps the most prominent one.
Programs compiled with GHC do use 'the stack', but for pattern matching, not for function calls. Whether 'stack frames' (i.e. the information associated with a function invocation) go on the heap or elsewhere is a bit complicated, especially since the compiler does so much analysis and optimization.
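A rough Python analogy (my own sketch, not GHC's actual representation): in a lazy language, applying a function mostly just allocates a thunk on the heap; forcing that thunk, which is what pattern matching does, is where the work happens:

```python
class Thunk:
    """A heap-allocated suspended computation, forced at most once."""
    def __init__(self, fn):
        self.fn, self.value, self.forced = fn, None, False

    def force(self):
        # First force does the work and memoizes; later forces are free.
        if not self.forced:
            self.value, self.forced = self.fn(), True
            self.fn = None  # drop the closure so it can be collected
        return self.value

t = Thunk(lambda: 2 + 3)   # allocation only, no arithmetic yet
print(t.force())           # 5: work happens when the value is demanded
```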
Also, e.g., an implementation of Unlambda (based on the SKI calculus) wouldn't necessarily have a stack either.
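For fun, a toy Python sketch of how that could work: SKI reduction as pure term rewriting, where terms are nested tuples (heap data), the spine being rewritten is an ordinary list, and the driver is a flat loop, so the host call stack never grows with the term:

```python
def reduce_ski(term):
    """Rewrite a term (nested tuples of applications over 'S'/'K'/'I')
    to weak head normal form. No recursion: the 'stack' is the spine
    list, an ordinary heap-allocated Python list."""
    while True:
        # Unwind the spine: collect application nodes down the left edge.
        spine = []
        while isinstance(term, tuple):
            spine.append(term)
            term = term[0]
        if term == 'I' and len(spine) >= 1:        # I x -> x
            term, spine = spine[-1][1], spine[:-1]
        elif term == 'K' and len(spine) >= 2:      # K x y -> x
            term, spine = spine[-1][1], spine[:-2]
        elif term == 'S' and len(spine) >= 3:      # S x y z -> (x z) (y z)
            x, y, z = spine[-1][1], spine[-2][1], spine[-3][1]
            term, spine = ((x, z), (y, z)), spine[:-3]
        else:
            # Head is not a reducible combinator: rebuild and stop.
            for node in reversed(spine):
                term = (term, node[1])
            return term
        # Reattach any applications above the redex, then keep rewriting.
        for node in reversed(spine):
            term = (term, node[1])

# S K K is the identity combinator: S K K v -> v
print(reduce_ski(((('S', 'K'), 'K'), 'v')))  # v
```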
Haskell compiled with GHC is one example. And the sister comment mentions certain Scheme/Racket implementations.
Something like Datalog wouldn't really have a call stack either, but Datalog is deliberately not Turing complete. (Then again, neither is Agda, and it's still a general-purpose language.)