Purely because I already know it well enough to be semi-dangerous. My day job is designing physical goods; I don't have the time to learn anything new just to satisfy my curiosity about signed distance fields.
I don't think you realize just how horrendously bad Python would be for this application. The global interpreter lock and marshaling between Python objects and GPU state would absolutely kill any kind of performance.
Pick up Rust and Bevy. It should be pretty easy to mock up what you want, and you can dip into wgpu when you need finer-grained control over the GPU.
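To make this concrete: the core of SDF work is just a handful of distance functions, which you can prototype in plain Rust before touching Bevy or wgpu at all. Below is a minimal sketch (no engine dependency; the `Vec3`, `sd_sphere`, and `smooth_min` names are my own, not from any library). In a real setup you'd port these functions to WGSL and raymarch them in a fragment shader:

```rust
// Minimal CPU-side signed distance field sketch in plain Rust.
// These are illustrative helpers, not a library API.

#[derive(Clone, Copy)]
struct Vec3 {
    x: f32,
    y: f32,
    z: f32,
}

impl Vec3 {
    fn new(x: f32, y: f32, z: f32) -> Self {
        Vec3 { x, y, z }
    }
    fn len(self) -> f32 {
        (self.x * self.x + self.y * self.y + self.z * self.z).sqrt()
    }
}

// Signed distance from point `p` to a sphere of radius `r` at the origin:
// negative inside, zero on the surface, positive outside.
fn sd_sphere(p: Vec3, r: f32) -> f32 {
    p.len() - r
}

// Polynomial smooth minimum: blends two distances over radius `k`,
// producing the rounded unions SDF modeling is known for.
fn smooth_min(a: f32, b: f32, k: f32) -> f32 {
    let h = (k - (a - b).abs()).max(0.0) / k;
    a.min(b) - h * h * k * 0.25
}

fn main() {
    let p = Vec3::new(2.0, 0.0, 0.0);
    let d = sd_sphere(p, 1.0); // d == 1.0: one unit outside the unit sphere
    println!("distance: {}", d);
    println!("blended: {}", smooth_min(d, 0.5, 0.3));
}
```

The nice part of prototyping the math on the CPU first is that the same functions translate almost line-for-line into WGSL once you move the evaluation into a shader.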