Data exfiltration was the most common category of malware in the Sonatype report, with more than 4,400 packages designed to steal secrets, personally identifiable information, credentials, and API tokens.

  • I don’t want to use the term “fear mongering”, but I think you may be a bit too concerned here.

    I’m concerned because I maintain numerous OSS projects, and I now have to be justifiably worried about supply chain attacks. Even Go projects tend to pull in tons of dependencies, and there’s a pattern I’m increasingly encountering where some library claims to be a “lightweight” or “small” library for X, but then it pulls in a dozen other projects, each pulling in their own dependencies. It isn’t lightweight if even one dependency is heavy, and I wish people would stop making this claim. But the security impact is that there are now dozens of projects I have to audit every time one of those dependencies bumps a version and I take the update.
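
    For Go specifically, that audit surface is easy to put a number on. The sketch below is only an illustration (the program and its output are mine, not an existing tool, and it assumes a standard Go toolchain on the PATH): run from a module root, it shells out to the real go list -m all command and counts how many modules end up in the build list.

```go
// depsurface: a minimal sketch that counts the modules in the current
// module's build list by shelling out to the standard `go list -m all`
// command. Run it from a module root; the name and output are illustrative.
package main

import (
	"bufio"
	"bytes"
	"fmt"
	"log"
	"os/exec"
)

func main() {
	out, err := exec.Command("go", "list", "-m", "all").Output()
	if err != nil {
		log.Fatalf("go list failed: %v", err)
	}

	sc := bufio.NewScanner(bytes.NewReader(out))
	count := 0
	for sc.Scan() {
		if len(bytes.TrimSpace(sc.Bytes())) > 0 {
			count++
		}
	}
	if count > 0 {
		count-- // the first line is the main module itself
	}
	fmt.Printf("modules in the build list besides this one: %d\n", count)
}
```

    Everything it counts beyond your direct dependencies is still a module whose version bumps land on your audit list.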

    This is an issue. It is an impediment to the people contributing to the Bazaar; it disincentivizes both developing and using OSS, and it’s especially harmful now that Linux is gaining more widespread popularity. I believe we need a concerted response.

    Go needs better security-focused static code analysis tools; there are any number of code quality checkers, but there are precious few security checkers, and the ones that exist focus on developer practices, such as string sanitization. I want a reporting tool that will identify which of my dependencies make network connections, and where, and what kind of information is being sent, so that I can focus my audits. Ideally, the Go team would run a service that provides a health check for a package - a third-party analysis that users (developers and end users) can trust… but at this point I’d pay a monthly fee to be able to submit packages and get a badge.
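
    To make concrete what I mean by a reporting tool, here is the kind of first pass I have in mind; it’s a rough sketch, not an existing tool. It uses golang.org/x/tools/go/packages (a real library) to load the current module and its dependencies, then flags any package that imports networking-capable standard library packages. The list of “interesting” imports is my own choice, and import-based flagging says nothing yet about where data goes or what it contains; it only tells me where to look first.

```go
// netreport: a coarse sketch of a dependency audit helper. It loads the
// current module's packages and their transitive dependencies, then reports
// every package that imports a networking-capable standard library package.
// This only narrows an audit; it cannot say what data is sent, or to where.
package main

import (
	"fmt"
	"log"

	"golang.org/x/tools/go/packages"
)

// Imports worth a closer look. The list is illustrative, not exhaustive.
var flagged = map[string]bool{
	"net":        true,
	"net/http":   true,
	"net/smtp":   true,
	"crypto/tls": true,
	"os/exec":    true, // not networking, but a common exfiltration path
}

func main() {
	cfg := &packages.Config{
		Mode: packages.NeedName | packages.NeedImports | packages.NeedDeps,
	}
	pkgs, err := packages.Load(cfg, "./...")
	if err != nil {
		log.Fatal(err)
	}

	// Visit walks the whole import graph, so transitive dependencies are
	// reported just like direct ones.
	packages.Visit(pkgs, nil, func(p *packages.Package) {
		for path := range p.Imports {
			if flagged[path] {
				fmt.Printf("%s imports %s\n", p.PkgPath, path)
			}
		}
	})
}
```

    Because it loads transitive dependencies as well, a flagged package buried deep in the tree shows up the same way a direct dependency does.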

    I think someone with InfoSec expertise could do a reasonable job with at least the statically compiled, modern languages, but I agree with your comment about it taking a community. If each PL community provided a static code security analysis tool, someone would eventually write a self-hostable service that could provide a score for most projects; at that point I’d expect this to become the purview of distributions - it’d be a significant value-add, a greater contribution than making yet another Ubuntu derivative that varies only by the default DE.

    Perhaps there are other approaches, such as LLM-driven code analysis; I’d expect that to be more effective with a model specifically trained to look for supply-chain attacks.

    I also contribute manifests to a couple of distributions, and I know neither of them does security gatekeeping on submitted packages, for the simple reason that the time between submission and acceptance is too short for anyone to have performed an analysis.

    This is going to bite us; the damage to OSS will be far worse if we, as a global community, only react after a broadly newsworthy event than it will be if we’re proactive and prevent it.

    I don’t think the average Joe or Jill is going to be interacting with all sorts of random, obscure FOSS projects the way more technical users like us do, the ones who program or experiment with services ourselves.

    Windows had an attack not so long ago

    Linux advocates long called Windows less secure merely because virus writers ignored Linux as too small a target to care about. That changed as the world’s internet infrastructure came to be dominated by Linux.

    The issue I’m specifically concerned about is FOSS, regardless of the platform. For a full half of the projects I maintain, I create release builds for Linux, Windows, OSX (Darwin), and OpenBSD. The attack is on the FOSS model, where software is freely exchanged.

    We are welcoming an entirely new wave of Windows refugees, many of whom are less technical. They’re mostly going to be using FOSS when they arrive, and the nature of supply-chain attacks is that they can show up in any program, even in core packages shipped with KDE, for example. Yes, they can also show up in commercial software, but unlike community-driven FOSS, commercial entities have the means to perform security audits, and consumers have some legal recourse - an organization to litigate against.

    I’m advocating for a concerted, proactive effort by InfoSec specialists in the FOSS community to come up with 1) a manifesto about how we’re going to respond to supply-chain attacks and malicious software, 2) tooling to help developers audit their dependencies in whatever PL they’re using, and 3) some mechanism of publishing results, even if it’s self-hosted. In the last case, diligent users will check multiple hosters against each other, and a couple will probably emerge as “trusted providers”; if the Go team hosted such a service, it would become the de facto authority. The Rust team could do the same.
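
    On point 3, the publishing side doesn’t need to be complicated. As a purely hypothetical sketch (the analysis itself is stubbed out, and every name and field here is my own invention), a self-hosted service could simply accept a module path and return a machine-readable report that other hosters can mirror and users can compare:

```go
// scorehost: a minimal sketch of the self-hostable publishing side.
// The analysis is stubbed out; the point is only the shape of the service:
// submit a module path, get back a machine-readable report that others can
// mirror and compare. All names and fields here are hypothetical.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Report is a hypothetical result format a distribution or end user could
// fetch, compare across hosters, and render as a badge.
type Report struct {
	Module   string   `json:"module"`
	Score    int      `json:"score"`    // 0-100, higher is better
	Findings []string `json:"findings"` // human-readable notes
}

// analyze is a stub standing in for the real static analysis.
func analyze(module string) Report {
	return Report{
		Module:   module,
		Score:    0,
		Findings: []string{"analysis not implemented in this sketch"},
	}
}

func main() {
	http.HandleFunc("/report", func(w http.ResponseWriter, r *http.Request) {
		module := r.URL.Query().Get("module")
		if module == "" {
			http.Error(w, "missing ?module=", http.StatusBadRequest)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(analyze(module))
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

    The value isn’t in any one instance of such a service; it’s in a report format boring enough that distributions and end users can fetch it from several independent hosters and notice when they disagree.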