
Different meaning of "tracked." This is about static-analysis systems that seek to understand the "provenance" of the files that go into the container-image, so that they can alert you to vulnerabilities in the container's dependencies.

"Dark matter" here is anything these tools can't see / notice vulnerabilities in.



So any DB container would by definition have a massively high percentage, just because the DB application itself is a few tens of MB while the database data runs to tens of gigabytes?

Seems like a really useless metric for containers.

I can see it for OSes (some packages there do manage DB data, and even have an option to remove it when the package is removed), but for a container it does seem a bit pointless.


No...? Again, we're talking about container images, not containers. Specifically, public container images sitting in registries like Docker Hub. People aren't burning their Postgres data into a container image and then pushing it, public-readable, to an image registry.

(But also, even ignoring that, I believe the metric used by the article is number-of-files, not byte-size. A DB might be large in byte-size, but it's usually negligible in number-of-files, since the data typically lives in individual table chunk files of 1GB or larger.)


As the container image is the result of a build process, unless the tools are the build tools themselves, the whole image should be treated as dark matter and simply rebuilt. It's process, not state.


It's the build process for the container-image (i.e. the Dockerfile or equivalent) that the tooling being discussed here is analyzing; not the resultant container image, nor containers spawned from said image.

The goal is, presumably, to figure out when a given docker image was created in such a way that it burns in a vulnerable version of some library; so that the author can be alerted that they need to (update their Dockerfile and) rebuild their image.

"Dark matter", under this definition, is anything that gets injected during the build process of the image, that is not itself traceable to some other versioned package management system with vulnerable-version deprecation. Without such information, an automated agent like the one described in the article cannot then propagate deprecations from consumed package-versions to produced image-tags.

A good example of such "dark matter" would be a static binary built outside the Dockerfile using a CI system, where the CI then creates a docker image by running a Dockerfile that simply injects the expected prebuilt binary into an image with an ADD stanza. Does that binary contain vulnerable versions of embedded static libraries? Who knows?
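A minimal sketch of that pattern (hypothetical Dockerfile; "myservice" is a made-up binary produced by an earlier CI step):

    # Base image and apk packages: visible to scanners via package metadata
    FROM alpine:3.19
    RUN apk add --no-cache ca-certificates

    # Prebuilt outside this Dockerfile by CI: no package metadata, so this
    # layer is "dark matter" to provenance tooling
    ADD myservice /usr/local/bin/myservice

    ENTRYPOINT ["/usr/local/bin/myservice"]

Any vulnerable static libraries embedded in myservice are invisible at this point; the scanner just sees an opaque file arriving in the ADD layer.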


Not sure it is that easy. The Docker API provides introspection for those as well, and an image doesn't become light matter just because the example project no longer uses an ADD stanza; the Dockerfile context can instead be a tarball that the project produces as a reproducible build artefact.
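For instance (hypothetical image name, just to illustrate the tarball-context case):

    # Build from a reproducible tarball used as the Dockerfile context
    docker build -t example/app:1.2.3 - < build-context.tar.gz

    # Introspect which instruction created each layer, and its size
    docker history example/app:1.2.3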



