I assume you are talking about a Windows/Linux PC. I have done that and got a Threadripper with 32 cores and 256 GB of RAM. It runs any LLaMA model on the CPU, although the 65B/70B models are quite slow. I also added an A6000 (48 GB VRAM), which lets you run the 65B/70B models quantized with very good performance.
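For reference, this is roughly how I load a quantized 70B with GPU offload. It's a minimal sketch assuming the llama-cpp-python bindings; the model filename and settings are placeholders, not exact values.

```python
# Sketch only: assumes llama-cpp-python built with CUDA support.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-70b.Q4_K_M.gguf",  # placeholder: a 4-bit quantized 70B GGUF file
    n_gpu_layers=-1,   # offload all layers to the GPU (fits in 48 GB VRAM when quantized)
    n_ctx=4096,        # context window
    n_threads=32,      # CPU threads used for anything not offloaded
)

out = llm("Explain what 4-bit quantization does to model size.", max_tokens=128)
print(out["choices"][0]["text"])
```

If the quantized model doesn't fully fit in VRAM, you can lower n_gpu_layers and let the remaining layers run on the CPU, which is where the extra cores and RAM still help.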
If you are going with the GPU and don't care about loads of RAM, then a 16-core Zen CPU will do just fine (or Intel, for that matter).
If you are only interested in LLaMA, then an M1 Studio with 64 GB of RAM is probably cheaper and will work just as well.