It's not. However, it is safer than C by default due to things like bounds-checking on arrays, slices (ptr+len), tagged unions, distinct typing for many things, an actual enum type, numerous checks for things like missing switch cases, and many more.
It's not trying to be memory safe, but rather to catch many of the common mistakes that C does not catch easily.
As for use-after-free, I'd argue that is more a problem with the memory allocation strategy in a language like C or Odin. Switching to something like an arena-based allocation strategy reduces issues like that by a hell of a lot. `malloc`/`free`-style approaches make use-after-free a lot more common because they make you micromanage allocations on a per-value level rather than on a per-shared-lifetime level. It's rare that a single value has a unique lifetime, and in many projects I'd argue it never happens.
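For illustration, here's a minimal bump-allocator sketch in C (the `Arena` type and function names are my own invention for this comment, not Odin's actual `core:mem` API): every value allocated for one job shares one lifetime, so the whole group dies in a single call and there is no per-value `free` to get wrong.

```c
#include <stddef.h>
#include <stdlib.h>
#include <string.h>

// A trivial bump-pointer arena: one backing block, one offset.
typedef struct {
    unsigned char *base;
    size_t         cap;
    size_t         used;
} Arena;

static int arena_init(Arena *a, size_t cap) {
    a->base = malloc(cap);
    a->cap  = cap;
    a->used = 0;
    return a->base != NULL;
}

// Allocate from the arena; returns NULL when the block is exhausted
// (a real implementation would grow or chain blocks instead).
static void *arena_alloc(Arena *a, size_t size) {
    size = (size + 15) & ~(size_t)15;   // keep allocations 16-byte aligned
    if (a->used + size > a->cap) return NULL;
    void *p = a->base + a->used;
    a->used += size;
    return memset(p, 0, size);          // zero-is-initialization
}

// Every value allocated from the arena is released together, in one call.
static void arena_release(Arena *a) {
    free(a->base);
    a->base = NULL;
    a->cap = a->used = 0;
}
```

The point is the shape of the API: one `arena_release` per shared lifetime instead of N `free`s per value, so a dangling pointer into the group can't outlive its siblings.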
One of the things I like about ObjC is that ARC is really, really well done. The only time I have to care about memory management in ObjC is with cyclic references (e.g. A->B and B->A). Those are about as rare as hen's teeth, and memgraph points them out if they ever do rear their heads.
ObjC is pretty much the sweet spot for me in terms of language complexity and cognitive load over C, while providing so much more. It's really a shame that a lot of people (and I'm not pointing fingers at anyone specifically) can't get past the [..] syntax, which comes from ObjC being a pure superset of C.
> And a change to something like an Arena like memory allocation strategy or something else reduces issues like that by a hell of a lot
So, for example, if I have a server: instead of allocating a Request object for each incoming request, I should pre-allocate an arena of N request objects and just keep reusing them? Doesn't this blow up memory usage if I have a static pool of objects that may not actually be needed, depending on traffic?
The language creator strongly suggests using arenas and zero-is-initialization. That's a programming paradigm that completely avoids having to keep track of zillions of small memory allocations, so the mechanisms needed to manage that are less important.
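Not quite a pool: an arena typically isn't N pre-built Request objects, it's a block of raw memory you allocate out of and then reset wholesale. A common pattern is one scratch arena per request (or per worker thread), reset between requests, so peak memory tracks the largest single request rather than worst-case traffic. A hedged C sketch of that loop, assuming a bump arena like the one above (`Request` and `handle_request` are made-up names, not any real server API):

```c
#include <stddef.h>
#include <stdlib.h>
#include <string.h>

typedef struct { unsigned char *base; size_t cap, used; } Arena;

static void *arena_alloc(Arena *a, size_t n) {
    n = (n + 15) & ~(size_t)15;
    if (a->used + n > a->cap) return NULL; // real code would grow or chain blocks
    void *p = a->base + a->used;
    a->used += n;
    return memset(p, 0, n);               // zero-is-initialization
}

// Hypothetical handler: every allocation it makes comes from `scratch`.
typedef struct { const char *path; char *body; } Request;

static void handle_request(Arena *scratch) {
    Request *req = arena_alloc(scratch, sizeof *req);
    char *buf    = arena_alloc(scratch, 4096);
    (void)req; (void)buf;
    // ... parse, build a response, write it out ...
}

int main(void) {
    Arena scratch = { malloc(1 << 20), 1 << 20, 0 }; // one block, reused forever
    if (!scratch.base) return 1;
    for (int i = 0; i < 100; i++) {   // stand-in for the server's accept loop
        scratch.used = 0;             // "frees" the previous request in O(1)
        handle_request(&scratch);
    }
    free(scratch.base);
    return 0;
}
```

Memory use is bounded by the biggest single request (times the number of concurrent workers), not by a static pool sized for peak traffic, and the O(1) reset replaces every individual free.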