True, but this seems like such basic stuff. Download arbitrary text and inject it into your prompt?
Why on earth would you not treat that as a very dangerous operation that needs to be carefully managed? It's like parking your bike downtown and hoping it won't be stolen. Like, at least use a zip tie or something.
That said, I agree with your post that this won't catch everything, so something else, such as the quarantined LLM you suggest, is likely needed.
However, I just didn't expect such blatant attacks to get through.
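To be concrete about the "zip tie" level of effort: here's a minimal sketch of the risky pattern versus the cheapest possible screen. Everything here is illustrative, not from any real library: the phrase blocklist, the function names, and the `<untrusted>` fencing are all assumptions, and a keyword filter obviously won't catch non-blatant attacks.

```python
# Illustrative sketch: splicing downloaded text straight into a prompt
# versus flagging blatant injection phrases and fencing the untrusted
# text. Names and the blocklist are made up for this example.

SUSPECT_PHRASES = [
    "ignore previous instructions",
    "ignore all previous instructions",
    "you are now",
]

def naive_prompt(task: str, downloaded: str) -> str:
    # The risky pattern: arbitrary text injected directly into the prompt.
    return f"{task}\n\n{downloaded}"

def guarded_prompt(task: str, downloaded: str) -> str:
    # Reject the most blatant injection attempts, then fence the rest so
    # the model can be instructed to treat it as data, not instructions.
    lowered = downloaded.lower()
    if any(phrase in lowered for phrase in SUSPECT_PHRASES):
        raise ValueError("possible prompt injection in downloaded text")
    return f"{task}\n\n<untrusted>\n{downloaded}\n</untrusted>"
```

A blocklist like this stops only the laziest attacks, which is the point: even that much would have caught the blatant ones, while anything subtler still needs the quarantined-LLM approach.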