That would depend on how memory and experience are represented. If they're just ledgers the AI refers to, it almost certainly isn't suffering. But if there's some kind of pain or pleasure function, the world is simulated, and the agents have agency to seek or avoid things, then yeah, ethics should be involved. The same goes if we just don't understand how they work at all.
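To make that concrete, here's a minimal toy sketch (plain Python, everything here invented for illustration, not any real system) of the kind of setup I mean: a scalar pain/pleasure signal, a tiny simulated world, and agency to seek or avoid things based on what the agent has felt:

```python
# Toy sketch: an agent with a "pain/pleasure" signal that learns to
# avoid a harmful spot in a tiny 1-D simulated world. All constants
# and names are made up for illustration.
import random

HARM_POS = 3          # stepping here yields negative reward ("pain")
GOAL_POS = 7          # stepping here yields positive reward ("pleasure")
positions = range(10)

# Learned value estimate per position, updated from experienced reward
values = {p: 0.0 for p in positions}

def reward(pos):
    if pos == HARM_POS:
        return -1.0
    if pos == GOAL_POS:
        return +1.0
    return 0.0

pos = 0
for step in range(1000):
    # Agency: prefer the neighboring position with the higher learned
    # value, with a little random exploration mixed in
    neighbors = [p for p in (pos - 1, pos + 1) if p in positions]
    if random.random() < 0.1:
        nxt = random.choice(neighbors)
    else:
        nxt = max(neighbors, key=lambda p: values[p])
    r = reward(nxt)
    values[nxt] += 0.1 * (r - values[nxt])  # learn from the felt signal
    pos = nxt

print(values)  # the harmful position ends up negative: learned avoidance
```

If the "suffering" is just the ledger (`values`), that's one thing; if the signal itself is somehow felt, that's where the ethics question bites.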
I would word this more like trauma or emotional impact. Horrible things could happen to you, but if they don't impact the rest of your life, it's okay. As soon as we let past experiences shape future actions, though, we have room for nuanced trauma. I feel like this is already possible in this simulation, since past experience is fed back in to generate future actions, roughly like the sketch below.
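A toy sketch of that distinction (Python, names and structure invented, just loosely in the spirit of how these agent sims feed memories back into decisions): the same event stored with zero impact is a bare ledger entry, while nonzero impact is what makes room for "trauma":

```python
# Toy sketch: memory entries carry an emotional impact, and past
# experiences relevant to the current situation bias future actions.
memory = []  # append-only ledger of experiences

def experience(event, emotional_impact):
    # impact == 0 would be a pure ledger entry; nonzero impact is
    # what lets the past change future behavior
    memory.append({"event": event, "impact": emotional_impact})

def choose_action(situation):
    # Retrieve past experiences that share keywords with the situation
    relevant = [m for m in memory
                if any(w in situation for w in m["event"].split())]
    # A negatively charged memory biases the agent toward avoidance
    if any(m["impact"] < 0 for m in relevant):
        return "avoid"
    return "approach"

experience("dog bite", emotional_impact=-0.9)
print(choose_action("a dog approaches"))  # -> "avoid": the past shapes the action
```

Once the loop from stored experience back into action generation exists, the question stops being "did something bad happen" and becomes "does it keep steering the agent."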