211 points by vinhnx 6 hours ago | 17 comments
12_throw_away 2 hours ago
I don't have much experience with GitHub's CI offering. But if this is an accurate description of the steps you need to take to use it securely ... then I don't think it can, in fact, ever be used securely.

Even if you trust Microsoft's cloud engineering on the backend, this is a system that does not appear to follow even the most basic principles of least privilege and isolation? I'm not sure why you would even try to build "supply-chain security" on top of this.

lrvick 2 hours ago
The only uv binaries in the world that were full-source bootstrapped, from signed package commits to signed reviews to multi-signed deterministic artifacts, are the ones from my teammates and me at stagex.

All keys live on geodistributed smartcards held by maintainers, tied to a web of trust going back 25 years with over 5,000 keys.

https://stagex.tools/packages/core/uv/

Though we're thankful for clients that let individual maintainers work on stagex part-time once in a while, we have had exactly one donation ever as a project: $50. (Thanks!)

Why is it that a bunch of mostly unpaid volunteer hackers are putting more effort into supply chain security than OpenAI?

I am annoyed.

duskdozer 1 hour ago
>Why is it that a bunch of mostly unpaid volunteer hackers are putting more effort into supply chain security than OpenAI?

Unpaid volunteer hackers provide their work for free under licenses designed for the purpose of allowing companies like OpenAI to use their work without paying or contributing in any form. OpenAI wants to make the most money. Why would they spend any time or money on something they can get for free?

hootz 1 hour ago
Yep. Permissive licenses, "open source", it's all just free work for the worst corporations you can think of.
philipallstar 1 hour ago
It's free work for anyone.
ra 1 hour ago
Not sure if you've fully got the context that OpenAI bought Astral, who "own" uv.
blitzar 31 minutes ago
The private jet won't fuel itself now, will it?
saghm 41 minutes ago
> Why is it that a bunch of mostly unpaid volunteer hackers are putting more effort into supply chain security than OpenAI?

Didn't the acquisition only happen a few weeks ago? Wouldn't it be more alarming if OpenAI had gone in and forced them to change their build process? Unless you're claiming that the article is lying about this being a description of what they've already been doing for a while (which seems a bit outlandish without more evidence), it's not clear to me why you're attributing this process to the parent company.

Don't get me wrong; there's plenty you can criticize OpenAI over, and I'm not taking a stance on your technical claims, but it seems somewhat disingenuous to phrase it like this.

pabs3 1 hour ago
What are you using for signed reviews?
lrvick 57 minutes ago
I promise we are actively working on a much better solution that we hope any distro can use, but for now we simply enforce that merge commits are signed by a maintainer other than the author, something they do only for code they have personally reviewed.
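The policy described above can be sketched as a simple check. The `Commit` type and its fields are hypothetical illustrations for the sake of the example (in practice the metadata would come from something like `git log --format='%GK'`), not stagex's actual tooling:

```python
# Toy model of the review-signing policy: a merge commit is accepted only
# if it is signed by a known maintainer who is NOT the author of the code,
# so the merge signature doubles as an attestation of review.
from dataclasses import dataclass


@dataclass
class Commit:
    author_key: str            # signing key fingerprint of the code author
    signer_key: str            # key that signed the merge commit
    signer_is_maintainer: bool # whether the signer is a trusted maintainer


def review_policy_ok(c: Commit) -> bool:
    # The signer must be a maintainer, and must differ from the author:
    # self-merges do not count as an independent review.
    return c.signer_is_maintainer and c.signer_key != c.author_key


print(review_policy_ok(Commit("A1", "B2", True)))   # independent review: True
print(review_policy_ok(Commit("A1", "A1", True)))   # self-merge: False
```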
sevg 5 hours ago
FYI it was actually William Woodruff (the article author) and his team at Trail of Bits that worked with PyPI to implement Trusted Publishing.
raphinou 5 hours ago
One big problem (amongst others) with the current software supply chain is that a lot of tools and dependencies are downloaded (e.g. from GitHub releases) without any validation that they were published by the expected author. That's why I'm working on an open-source, auditable, accountless, self-hostable, multi-sig file authentication solution. The multi-sig approach can protect against axios-like breaches. If this is of interest to you, take a look at https://asfaload.com/
darkamaul 4 hours ago
I’m maybe not understanding here, but isn’t that the point of release attestations (to authenticate that the release was produced by the authors)?

[0] https://docs.github.com/en/actions/how-tos/secure-your-work/...

raphinou 4 hours ago
Artifact attestations are indeed another solution, based on https://www.sigstore.dev/ . I still think Asfaload is a good alternative, making different choices than sigstore:

- Asfaload is accountless (keys are identities), while sigstore relies on OpenID Connect [1], which will tie most users to a mega corp

- Asfaload's backend is a public git repo, making it easily auditable

- Asfaload will be easy to self-host, meaning you can easily deploy it internally

- Asfaload is multi-sig, meaning even if a GitHub account is breached, malicious artifacts can be detected

- validating a download is transparent to the user, requiring only the download URL, contrary to sigstore [2]

So Asfaload is not the only solution, but I think it has some unique characteristics that make it worth evaluating.

[1]: https://docs.sigstore.dev/about/security/

[2]: https://docs.sigstore.dev/cosign/verifying/verify/
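The multi-sig idea mentioned above can be sketched as a k-of-n threshold check over an artifact hash. This is purely an illustration of the concept, not Asfaload's actual scheme or formats, and HMAC stands in for real asymmetric signatures to keep the sketch self-contained:

```python
# k-of-n artifact verification sketch: an artifact is trusted only if at
# least `threshold` independent signers vouch for the same digest, so a
# single compromised account cannot publish a malicious artifact alone.
import hashlib
import hmac


def sign(secret: bytes, digest: bytes) -> bytes:
    # Stand-in for a real signature (e.g. an OpenPGP or minisign signature).
    return hmac.new(secret, digest, hashlib.sha256).digest()


def verify_multisig(artifact: bytes, sigs: dict[str, bytes],
                    keys: dict[str, bytes], threshold: int) -> bool:
    digest = hashlib.sha256(artifact).digest()
    valid = sum(
        1 for signer, sig in sigs.items()
        if signer in keys
        and hmac.compare_digest(sign(keys[signer], digest), sig)
    )
    return valid >= threshold


keys = {"alice": b"k1", "bob": b"k2", "carol": b"k3"}
artifact = b"release-1.0.0.tar.gz contents"
digest = hashlib.sha256(artifact).digest()
sigs = {"alice": sign(b"k1", digest), "bob": sign(b"k2", digest)}

print(verify_multisig(artifact, sigs, keys, threshold=2))               # True
# One signature alone does not clear the threshold:
print(verify_multisig(artifact, {"alice": sigs["alice"]}, keys, 2))     # False
```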

arianvanp 4 hours ago
The problem is nobody checks.

All the axios releases had attestations except for the compromised one. npm installed it anyway.

raphinou 4 hours ago
Yes, that's why I aim to make the checks transparent to the user. You only need to provide the download URL for the authentication to take place. I really need to record a small demo of it.
est 2 hours ago
> without any validation that it was published by the expected author

SPOF. I'd suggest using automatic tools to audit every line of code, no matter who the author is.

snthpy 4 hours ago
Overall I believe this is the right approach and something like this is what's required. I can't see any code or your product though so I'm not sure what to make of it.
raphinou 4 hours ago
Here's the GitHub repo of the backend code: https://github.com/asfaload/asfaload

There's also a spec of the approach at https://github.com/asfaload/spec

I'm looking for early testers, so let me know if you are interested in testing it!

dirkc 4 hours ago
The open source ecosystem has come very far and proven to be resilient. And while trust will remain a crucial part of any ecosystem, we urgently need to improve our tools and practices when it comes to sandboxing third-party code.

Almost every time I bump into uv in project work, the touted benefit is that it makes it easier to run projects with different Python versions while avoiding clashes of third-party dependencies - basically pyenv + venv + speed.

That sends a cold shiver down my spine, because it tells me that people are running all these different tools on their host machine with zero sandboxing.

Oxodao 3 hours ago
Meh, not always. I do use uv in Docker all the time, it's quite handy
dirkc 3 hours ago
Honest question - what are the main benefits for you when you use it in docker?

ps. I feel like I've been doing python so long that my workflows have routed around a lot of legit problems :)

silvester23 1 hour ago
For us, the DX of uv for dependency management is much better than just using pip and requirements.txt.

To be clear though, we only use uv in the builder stage of our docker builds, there is no uv in the final image.
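The split described above (uv in the builder stage only) is commonly done with a multi-stage build. The image names, paths, and entry point below are illustrative assumptions, not the commenter's actual setup:

```dockerfile
# Builder stage: uv resolves and installs dependencies into a venv.
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev

# Final stage: only the resolved virtualenv is copied over; no uv binary
# (or any other build tooling) ships in the runtime image.
FROM python:3.12-slim
WORKDIR /app
COPY --from=builder /app/.venv /app/.venv
COPY . .
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "myapp"]
```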

Oxodao 2 hours ago
Mainly the "project" system. I'm only developing Python in my free time, not professionally, so I'm not as well versed in its ecosystem as I am in PHP. There are tons of ways to set up project-like structure, and I don't want to deal with those. I used to do raw Python containers + requirements.txt, but the DX was absolutely not enjoyable. I'm just used to uv now.
sersi 3 hours ago
Main reason I now use uv is being able to specify a cooldown period. pip allows it, but only with an absolute timestamp, so it's pretty much useless.

And that doesn't prevent me from running it in a sandbox or VM for an additional layer of security.

zwp 3 hours ago
> pip allows it but it's with a timestamp

A PR adding support for a relative timestamp in pip was merged just last week:

https://github.com/pypa/pip/pull/13837/commits

carderne 3 hours ago
If anyone from Astral sees this: at this level of effort, how do you deal with the enormous dependence on GitHub itself? You maintain social connections with upstream and with PyPA... but what if GitHub is compromised or buggy and changes the effect of some setting you depend on?
Zopieux 2 hours ago
The entire paragraph about version pinning using hashes (and using a map lookup for in-workflow binary deps) reminds me that software engineers are forever doomed to reinvent worse versions of nixpkgs and flakes.

I don't even love Nix, it's full of pitfalls and weirdnesses, but it provides so much by-default immutability and reproducibility that I sometimes forget how others need to rediscover this stuff from first principles every time a supply chain attack makes the news.
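The hash-pinning pattern being referenced looks roughly like this in a workflow file; the SHA below is a placeholder, not a real release commit of any action:

```yaml
# Pin third-party actions to a full commit SHA rather than a mutable tag.
# The trailing comment records the human-readable version; only the SHA
# is what actually gets fetched and executed.
steps:
  - uses: actions/checkout@0000000000000000000000000000000000000000 # v4.2.2
```

Tags like `@v4` can be silently repointed by whoever controls the repository, which is exactly the mutability that a Nix-style content-addressed pin removes by construction.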

nDRDY 1 hour ago
>worse versions of nixpkgs and flakes

You mean statically-compiled binaries and hash pinning? Those have been around a bit longer than Nix :-)

Zopieux 37 minutes ago
Were they deployed at scale in such a way that most open (and some non-free) software is packaged that way? I've never seen this happen before nixpkgs.
darkamaul 5 hours ago
With the recent incidents affecting Trivy and litellm, I find it extremely useful to have a guide on what to do to secure your release process.

The advice here is really solid and actionable, and I would suggest any team read it, and implement it where possible.

The scary part with supply chain security is that we are only as secure as our dependencies, and if the platform you’re using has non secure defaults, the efforts to secure the full chain are that much higher.

trashcan2137 4 hours ago
The lengths people will go to in order to rediscover Nix/Guix are beyond me
3abiton 4 hours ago
I don't see the connection though?
Eufrat 4 hours ago
Nix provides declarative, reproducible builds. So, ostensibly, if you had your build system using Nix, then some of the issues here go away.

Unfortunately, Nix is also not how most people function. You have to do things the Nix way, period. The value comes in part from this strong opinion, but it also makes Nix inherently niche. Most people do not want to learn an entire new language/paradigm just to get this feature, and so it becomes a chicken-and-egg problem. IMHO, it also suffers from a bit of snobbery and poor naming (Nix vs. NixOS vs. Nixpkgs), which makes it that much harder to gain traction.

diffeomorphism 3 hours ago
There are different notions of "reproducible". Nix does not automatically make builds reproducible in the way that matters here:

https://reproducible.nixos.org

It is still good at this, but the difference from other distros is rather small:

https://reproducible-builds.org/citests/

trashcan2137 2 hours ago
Nix, if not used incorrectly (and they really make it hard to use it, both correctly and incorrectly lol), gives you reproducible and verifiable builds.

Unfortunately I have to agree with the sibling comment that it suffers from poor naming and the docs are very hard to grok which makes it harder to get traction.

I really hate the idea that "it's all sales at the end of the day", but if Nix could figure out how to "sell" itself to more people, then we would probably have fewer of these problems.

sunshowers 1 hour ago
If it doesn't work on Windows, it is not a full replacement.
Zopieux 2 hours ago
Reading the paragraph on hash pinning and "map lookup files" (lockfiles) made me audibly sigh.
tao_oat 3 hours ago
This is a really great overview; what a useful resource for other open-source projects.
ChrisArchitect 5 hours ago
Earlier submission from author: https://news.ycombinator.com/item?id=47691466
ramoz 4 hours ago
Created an agent skill based on this blog. Assessing my own repos now.

https://github.com/backnotprop/oss-security-audit