
You would still be exposed if you had renovate or dependabot make a PR where they update the hash for you, though. Here's a PR we got automatically created the other day:

  -uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3
  +uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3

and the workflows for this PR run with privileges, since it comes from a user with write permissions.
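
For context, these hash-bump PRs typically come out of an update config roughly like the following; this is only a sketch of a .github/dependabot.yml, and the weekly schedule is illustrative. Renovate can be configured to do the same thing for pinned action digests.

  # .github/dependabot.yml (sketch; the schedule is illustrative)
  version: 2
  updates:
    - package-ecosystem: "github-actions"  # watches the `uses:` lines in workflow files
      directory: "/"
      schedule:
        interval: "weekly"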




I don't think you should ever allow dependabot to make direct commits to the repository. The only sane setting (IMO) is for dependabot to just make PRs, which a human then verifies and merges. My personal opinion is that for any serious repository, giving a robot commit access is a bad time and a ticking time bomb, security-wise.

Now, of course, if there are literally hundreds of dependencies to update every week, a human isn't really going to go through each one and make sure it looks good, so that person just becomes a rubber-stamper, which doesn't help the situation. At that point the team should probably seriously evaluate whether a tech stack that drags in that many dependencies is simply broken.
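
For reference, the auto-merge setup being argued against is usually a small workflow roughly along these lines (a sketch loosely following GitHub's documented Dependabot auto-merge recipe; the file name, job name, and squash strategy are illustrative):

  # .github/workflows/dependabot-automerge.yml (sketch of the pattern being warned about)
  name: Dependabot auto-merge
  on: pull_request

  permissions:
    contents: write
    pull-requests: write

  jobs:
    automerge:
      runs-on: ubuntu-latest
      if: github.actor == 'dependabot[bot]'
      steps:
        # Flags the PR for auto-merge, so it lands as soon as checks pass --
        # no human ever reads the diff.
        - run: gh pr merge --auto --squash "$PR_URL"
          env:
            PR_URL: ${{ github.event.pull_request.html_url }}
            GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}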


Even if you don't auto-merge, the bot will often have elevated rights (it needs to be able to see your private repository, for instance), so its PR will run your build jobs, possibly with the updated action version, and just by doing that it can expose your secrets even without anything being committed to main.
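
Concretely, the worry is an ordinary build workflow along these lines (a sketch; the trigger, the registry login step, and the REGISTRY_TOKEN secret are hypothetical): the bot's branch both supplies the bumped action and triggers the job that is handed a credential.

  # .github/workflows/build.yml (sketch; REGISTRY_TOKEN is a hypothetical secret)
  name: build
  on: pull_request   # also fires for the bot's hash-bump PRs

  jobs:
    build:
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v4
        # Per the point above, the bumped action version from the PR branch runs here
        # with whatever secrets the job is given, before anything lands on main.
        - uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3
          with:
            username: ci-bot
            password: ${{ secrets.REGISTRY_TOKEN }}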


From a security standpoint, automating GitHub Actions hash updates defeats the purpose of pinning them in the first place.
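
For comparison, using the hashes from the PR above, this is the difference pinning is supposed to buy you:

  # floating tag: whoever controls the action can repoint v3 at new code at any time
  - uses: docker/login-action@v3
  # pinned digest: immutable until someone deliberately reviews and bumps the hash
  - uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3

If the hash bump itself is automated and rubber-stamped, the pin effectively tracks whatever upstream publishes again.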



