Prison bars are dumber than a bacterium and they work just fine.
Still, the idea that AI is suddenly X times smarter than people is ridiculously naive. Intelligence does not fit on a linear scale. And being smarter does not change the correct answer.
This is more like a 'reverse hacker': instead of a brilliant hacker trying to get into a system, it is a brilliant hacker trying to get out of one, and in this case the hacker is likely vastly more brilliant than the defenders. The same rules apply: the hacker has to succeed only once, while the 'jailers' have to succeed every time. Predicted long-term outcome: escape.
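To make that asymmetry concrete (a rough sketch, with p and n as illustrative placeholders, not numbers anyone actually knows): if each escape attempt succeeds with some small probability p, then over n independent attempts the chance of at least one success is 1 - (1 - p)^n, which creeps toward 1 as n grows. On a long enough timeline, the jailers' odds only get worse.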
Can I keep an intelligent AI in a box? Sure, unplug it.
Can I keep all AIs in a box? Well, no.
PS: Lots of dumb things are said about AIs. Sadly, people tend to think of science fiction as magic-but-in-the-future, and then picture AIs as ultimate wizards able to reshape reality at their whim. Realistically, the first true AI may find programming boring, and so much for the singularity. If an AI smarter than you is a bad idea, it's unlikely that a progression of AIs would keep building ever more intelligent replacements.
Anyway, you're going to have to develop your AI somewhere, you're going to have to move it to the box somehow, you're going to have to train it somehow, and you're going to have to have it interact with the real world somehow. All of those are opportunities for escape. I think 'unplug it' sort of defeats the purpose of having an AI.
The problem isn't with AIs dumber than people; it's with what happens when someone finally builds a smarter one. It doesn't matter how good prison bars are if you're smart enough to social-engineer your way out of the cell. And human criminals do escape from prison every now and then.