You can use sin as an activation function, but it requires careful initialization: without it, training becomes unstable because many pre-activations land in regions where the derivative of sin is near zero, and the periodicity makes the loss surface hard to navigate. See Implicit Neural Representations with Periodic Activation Functions (SIREN) for details and a principled initialization scheme.
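As a rough illustration, here is a minimal NumPy sketch of the initialization scheme the SIREN paper proposes (first layer uniform in ±1/fan_in, hidden layers uniform in ±sqrt(6/fan_in)/ω₀, with ω₀ = 30 as in the paper); the layer sizes and helper names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)

def siren_init(fan_in, fan_out, is_first=False, omega_0=30.0):
    # SIREN initialization: keeps pre-activations spread across
    # sin's non-flat regions so gradients don't collapse to zero.
    if is_first:
        bound = 1.0 / fan_in                    # first layer: U(-1/n, 1/n)
    else:
        bound = np.sqrt(6.0 / fan_in) / omega_0  # hidden: U(-sqrt(6/n)/w0, ...)
    return rng.uniform(-bound, bound, size=(fan_in, fan_out))

def siren_forward(x, weights, omega_0=30.0):
    # Hidden layers apply sin(omega_0 * Wx); output layer stays linear.
    h = x
    for W in weights[:-1]:
        h = np.sin(omega_0 * (h @ W))
    return h @ weights[-1]

# Toy network: 1-D coordinate input, two hidden layers of width 64.
dims = [1, 64, 64, 1]
weights = [siren_init(dims[i], dims[i + 1], is_first=(i == 0))
           for i in range(len(dims) - 1)]

x = np.linspace(-1.0, 1.0, 100).reshape(-1, 1)
y = siren_forward(x, weights)
```

With a naive Gaussian initialization instead, the sin arguments grow with depth and most units end up near the flat extrema, which is exactly the failure mode the careful initialization avoids.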