I'm not sure why the parent was downvoted; limiting an AI's computational capacity is considered a stunting method, and although such methods are not in and of themselves solutions to the control problem, they may be tools that assist us in developing a solution.
For example, we may want to first study AGI by restricting it to a level of intelligence slightly below that of a human. To accomplish this, we might employ various stunting and/or tripwire methods to ensure the AGI remains within those parameters as it develops.
Hypothetically, though, the AI could cunningly manufacture a crisis that forces us to raise its clock rate so that it can compute a solution to some impending disaster.