IMO, gzip compression is really something that should be handled by your web server, and most do it easily. There's a jekyll-gzip plugin if for some reason you need it.
The content is static, and web servers like nginx support serving gzip-compressed equivalents if they find them alongside the uncompressed files. Might as well do that compression once rather than on every request.
If you're doing it just once, you can also realistically put in the extra effort and use zopfli for better (albeit slower) compression.
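For what it's worth, nginx's gzip_static module does exactly that. A minimal sketch, with placeholder domain and paths:

```nginx
server {
    listen 80;
    server_name example.com;        # placeholder domain
    root /var/www/site/_site;       # wherever the Jekyll build output lives

    location / {
        # If index.html.gz exists next to index.html, serve it as-is instead
        # of compressing on the fly. The .gz files can be produced once at
        # build time, e.g. with `gzip -k -9` or zopfli.
        gzip_static on;
        try_files $uri $uri/ =404;
    }
}
```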
Most major web servers cache the gzipped content, so the compression only happens once.
Plus you can actually look at the files on the server directly, without piping them through gzip just to check them.
You can also elect not to gzip files smaller than the MTU (1500 bytes) and stop wasting your time and your clients'. Realistically, it's often wasteful to gzip small files (below ~5 KB).
If you're worried about squeezing maximum compression out of your text files, then you're serving up too much stuff to clients anyway. You probably don't care about your users... just about your bloated page loading fast. Get rid of some ad garbage.
That is what GitLab Pages expects, and because I can control my own pipeline with GitLab, I'm able to support multiple file formats that GitHub would not. It doesn't get any simpler than the example I provided.
I'll try to contain my derision towards serious conversations about hosting performance-sensitive content on GitHub/GitLab Pages. As explained in a separate comment, gzipping _everything_ is a waste of CPU.
If you just gzip from nginx, have content caching enabled (why wouldn't you with static content?), and set a minimum size for gzipping, you get the best possible scenario and only take a small hit on the first request for each file until server restart/cache expiration.
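A rough sketch of that setup, with placeholder paths and a threshold you'd tune yourself:

```nginx
http {
    # Compress text responses on the fly, but skip tiny files where the
    # overhead isn't worth it.
    gzip on;
    gzip_min_length 1500;   # roughly one MTU; tune to taste
    gzip_comp_level 5;
    gzip_types text/css text/plain application/javascript application/json image/svg+xml;
    # text/html is always compressed when gzip is on, no need to list it.

    server {
        listen 80;
        root /var/www/site;  # placeholder

        location / {
            # Let clients and intermediaries cache the static files so the
            # compression cost is only paid on uncached requests.
            expires 1h;
            add_header Cache-Control "public";
            try_files $uri $uri/ =404;
        }
    }
}
```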