We don't know that "goal alignment" (to use the techno-cult name) is a hard problem; we don't know that it's an important problem; we don't even know what the problem is. We don't know that intelligence is "general problem solving." In fact, we can be pretty sure it isn't, because humans aren't very good at solving general problems, just at solving human problems.