python-tuf [1] back then assumed that everything was manipulated locally, yes, but a lot has changed since then: you can now read/write metadata entirely in memory, and integrate with key-management backends such as GCP KMS.
More importantly, I should point out that while Sigstore's Fulcio will help with key management (think of it as a managed GPG, if you will), it will not help with securely mapping software projects to their respective OIDC identities. Without this, how will verifiers know in a secure yet scalable way which Fulcio keys _should_ be used? Otherwise, we would then be back to the GPG PKI problem with its web of trust.
This is where PEP 480 [2] can help: you can use TUF (especially after TAP 18 [3]) to do this secure mapping. Marina Moore has also written a proposal called Transparent TUF [4] for having Sigstore manage such a TUF repository for registries like PyPI. This is not to mention the other benefits that TUF can give you (e.g., protection from freeze, rollback, and mix-and-match attacks). We should definitely continue discussing this sometime.
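To make that "secure mapping" concrete: in TUF, the top-level targets role can delegate paths to per-project roles, and with TAP 18 a delegation can name a Fulcio/OIDC identity instead of a raw public key. Here is a hypothetical sketch of what such delegated-targets metadata could look like; the field names and identity values are illustrative, not the exact TAP 18 wire format:

```python
import json

# Hypothetical delegation: the repository's targets role says "anything
# under requests/* must be signed by this Fulcio/OIDC identity".
# Field names and values are illustrative, not the exact TAP 18 format.
delegation = {
    "roles": [
        {
            "name": "requests",
            "paths": ["requests/*"],
            "keyids": ["fulcio-identity-0"],
            "threshold": 1,
        }
    ],
    "keys": {
        "fulcio-identity-0": {
            "keytype": "sigstore-oidc",
            "keyval": {
                "identity": "maintainer@psf.example",
                "issuer": "https://accounts.example.com",
            },
        }
    },
}

# TUF metadata is ultimately just signed canonical JSON bytes; a verifier
# parses it and checks that artifacts under requests/* carry Fulcio certs
# bound to that identity.
raw = json.dumps(delegation, sort_keys=True).encode()
parsed = json.loads(raw)
role = parsed["roles"][0]
key = parsed["keys"][role["keyids"][0]]
print(role["paths"], key["keyval"]["identity"])
```

The point is that the repository, not each end user, maintains the project-to-identity mapping, which is what PEP 480 / Transparent TUF would provide on top of Fulcio.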
> Can SubtleCrypto accelerate any of the W3C Verifiable Credential Data Integrity 1.0 APIs?

vc-data-integrity: https://w3c.github.io/vc-data-integrity/ (ctrl-f "signature suite")
>> ISSUE: Avoid signature format proliferation by using text-based suite value
>> The pattern that Data Integrity Signatures use presently leads to a proliferation in signature types and JSON-LD Contexts. This proliferation can be avoided without any loss of the security characteristics of tightly binding a cryptography suite version to one or more acceptable public keys. The following signature suites are currently being contemplated: eddsa-2022, nist-ecdsa-2022, koblitz-ecdsa-2022, rsa-2022, pgp-2022, bbs-2022, eascdsa-2022, ibsa-2022, and jws-2022.
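The fix that issue points at is a single generic proof type carrying the suite as a plain text value, rather than one proof type (and JSON-LD Context) per suite. A minimal sketch of that shape, with placeholder values:

```python
import json

# One generic proof type; the cryptosuite is just a string, so adding
# "eddsa-2022" vs "jws-2022" does not require minting a new proof type
# or JSON-LD Context. All values below are placeholders.
proof = {
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-2022",
    "created": "2022-01-01T00:00:00Z",
    "verificationMethod": "https://example.com/issuer#key-1",
    "proofPurpose": "assertionMethod",
    "proofValue": "zPLACEHOLDER",  # placeholder, not a real signature
}
round_tripped = json.loads(json.dumps(proof))
print(round_tripped["cryptosuite"])
```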
> TUF "targets" roles may delegate to Fulcio identities instead of private keys, and these identities (and the corresponding certificates) may be used for verification.
s/Fulcio/W3C DID/g may have advantages. Or is there already a way to use W3C Decentralized Identifiers (DIDs) to keep track of key material in RDFS properties of a DID class?
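For what it's worth, a DID document already tracks key material directly in its verificationMethod entries (JSON-LD properties rather than RDFS class properties). A minimal sketch following the DID Core data model, with placeholder identifier and key values:

```python
import json

# Minimal DID document shape per the W3C DID Core data model.
# The did:example identifier and key value are placeholders.
did_doc = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "verificationMethod": [
        {
            "id": "did:example:123456789abcdefghi#key-1",
            "type": "Ed25519VerificationKey2020",
            "controller": "did:example:123456789abcdefghi",
            "publicKeyMultibase": "zPLACEHOLDER",  # placeholder key material
        }
    ],
    # Verification relationships reference keys by id, enabling rotation:
    "authentication": ["did:example:123456789abcdefghi#key-1"],
}
doc = json.loads(json.dumps(did_doc))
method = doc["verificationMethod"][0]
print(method["type"])
```

That solves key *representation*, but the thread's open question — who securely maps a software project to its DID or OIDC identity — remains the part TUF-style delegations address.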
What command(s) do I pass to pip/twine/build (and what goes in pyproject.toml) to build, upload, and install a package with a key/cert that users should trust for e.g. psf/requests?
>> The complexity designed into this system might make sense. TUF is very complex and not worth it for most projects, but Debian is exactly what TUF is designed for.
I disagree that TUF is too complicated for most projects. While our documentation, tutorials, and tooling can be better, the setup is about as complicated as, say, devising an in-toto root layout. Most open source projects should really just worry about subscribing to something like PEP 480 and signing with one-time Fulcio keys. But I think we are largely on the same page here: yes, please just use minisign/signify if you want simplicity, but if you want resilience against nation-state attacks, you need something like TUF (coupled with in-toto and sigstore). We are happy to advise.
> I disagree that TUF is too complicated for most projects. While our documentation, tutorials, and tooling can be better, the setup is about just as complicated as ...
I've heard great things about TUF but if you want people to adopt it then it seems like the documentation/tutorials/tooling should be a first class citizen
Thanks for your comment. I completely agree, and we are working on it. If you have any suggestions for documentation/tutorials/tooling you would like to see, I'd be happy to add them to the list.
We are actively working to improve the reference implementation, to make it easier to maintain (easier-to-read code, type annotations, generally more Pythonic style, cleaner design) and easier to use (a cleaner, documented API; easier to plug in your own implementation of things a content update system might already have an opinionated implementation of, e.g. the network communication stack).
We hope to build more tools on top of the cleaned up reference implementation once it is feature complete.
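For illustration, the "plug in your own network stack" idea boils down to the client talking to an abstract fetcher interface that integrators can implement themselves. This is a toy sketch of the pattern, not python-tuf's actual API:

```python
from abc import ABC, abstractmethod

class Fetcher(ABC):
    """Toy stand-in for a pluggable download interface (illustrative only)."""

    @abstractmethod
    def fetch(self, url: str) -> bytes:
        """Return the raw bytes at url."""

class InMemoryFetcher(Fetcher):
    """A 'network stack' backed by a dict -- e.g. for tests, or for a
    system that already ships its own transfer machinery."""

    def __init__(self, store: dict) -> None:
        self.store = store

    def fetch(self, url: str) -> bytes:
        return self.store[url]

# The update client only ever talks to the interface, so swapping in
# requests/urllib3/curl (or no network at all) needs no client changes.
fetcher = InMemoryFetcher({"https://repo/root.json": b"{}"})
print(fetcher.fetch("https://repo/root.json"))
```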
Note that TUF is great for things with multiple contributors (think npm or PyPI).
For the simple case of "a single publisher publishes updates for a single product", TUF is overkill. Something like signify or seccure will be far easier to set up and use.
signify is nice when key distribution, revocation, and rotation are handled for you... but how do you do that securely for many different publishers on a single repo?
[1] https://github.com/theupdateframework/python-tuf
[2] https://peps.python.org/pep-0480/
[3] https://github.com/theupdateframework/taps/blob/master/tap18...
[4] https://docs.google.com/document/d/1WPOXLMV1ASQryTRZJbdg3wWR...