alt.hn

3/26/2025 at 5:53:47 PM

Malware found on NPM infecting local package with reverse shell

https://www.reversinglabs.com/blog/malicious-npm-patch-delivers-reverse-shell

by gnabgib

3/26/2025 at 7:15:02 PM

The fact that http fetches and fs reads don't prompt the user is continually the craziest part of `npx` and `package.json`'s `postinstall`.

Does anyone have a solution to wrap binary execution (or npm execution) and require explicit user authorization for network or fs calls?

by love2read

3/26/2025 at 7:39:40 PM

I believe the Deno permission system[0] does what you're asking, and more.

(Deno is a JavaScript runtime co-created by Ryan Dahl, who created Node.js - see his talk "10 Things I Regret About Node.js"[1] for more of his motivations in designing it.)
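
For example, a minimal sketch of Deno's flag-based model (the path and host values here are just placeholders):

  # with no flags, file, network, or env access triggers an interactive prompt
  deno run main.ts

  # explicitly grant only what the script needs
  deno run --allow-read=./data --allow-net=api.example.com main.ts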

[0] https://docs.deno.com/runtime/fundamentals/security/

[1] https://www.youtube.com/watch?v=M3BM9TB-8yA

by jrmann100

3/26/2025 at 9:08:47 PM

Yes, explicitly asking you if you want to run the install script is the first warning (which pnpm can do too).

Then it would halt on file access or network permissions.

It could still get you if you lazily allow everything everywhere, though, and this is why you shouldn't do that.

by DimmieMan

3/26/2025 at 11:42:26 PM

Yes, and you can run almost every npm package:

  deno run npm:@angular/cli --help

by simlevesque

3/26/2025 at 8:15:28 PM

pnpm skips all `postinstall` runs by default now. You can explicitly allow-list specific ones.

If you use that, I'd highly recommend configuring it to throw an error instead of just silently skipping the postinstall though: https://github.com/karlhorky/pnpm-tricks#fail-pnpm-install-o...
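
For reference, the allow-list lives in package.json and looks roughly like this (a sketch; check pnpm's docs for the exact key, which I believe is `onlyBuiltDependencies`):

  {
    "pnpm": {
      "onlyBuiltDependencies": ["esbuild", "sharp"]
    }
  }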

by bilalq

3/27/2025 at 12:04:14 AM

Bun does the same.

by spiffytech

3/27/2025 at 3:57:04 AM

Sure, but switching from node to bun is a much more invasive change than switching from npm to pnpm. And not always possible.

by bilalq

3/27/2025 at 11:54:47 PM

It's quite easy, actually. We did this at work, recently.

Bun is two things. Most commonly, it's known for its Node-competitor runtime, which is of course a very invasive change. But it can also be used purely as a package manager, with Node as your runtime: https://bun.sh/docs/cli/install
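
In practice the split looks something like this (a sketch; bun reads the same package.json):

  bun install     # bun resolves and installs node_modules
  node server.js  # the app itself still runs on Node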

As a package manager, it's much more efficient, and I would recommend switching over. I haven't used pnpm, though; we came from yarn (v2, and I've used v1 in the past).

We still use Node for our runtime, but package install time has dropped significantly.

This is especially felt when switching branches on dev machines where one package in the workspace has changed, causing yarn to re-fetch all packages (even though yarn.lock exists), whereas bun only downloads the new package.

by croshan

3/27/2025 at 9:43:04 AM

Yes. I was more pointing out that blocking postinstall scripts is becoming a trend across multiple projects. Possibly a portent for the ecosystem as a whole. I could have communicated that more clearly.

by spiffytech

3/27/2025 at 2:20:49 AM

I will say, for all the (sometimes valid) complaints about NPM and the ecosystem, I don't hear similar complaints about Go.

Go encourages package authors to simply link to their git repository. It is quite literally cloning source files onto your computer without much thought.

by no_wizard

3/27/2025 at 4:20:50 AM

No code execution during dependency fetching. And the dependency tree is very shallow for most projects, making it easier to audit.

by skydhash

3/27/2025 at 10:58:13 AM

Still no guard rails, simply raw source code. It would be easy for anything to be hiding within. Given observed behavior, I doubt most people are auditing the source either.

It's ripe for an exploit.

by no_wizard

3/27/2025 at 1:46:32 PM

That's a different issue, which most libraries will have - when you run their code, they may do extra things.

This is talking about the same thing, but at install time as well.

by robertlagrant

3/27/2025 at 9:08:01 PM

Two differences:

Best practice now is to not run postinstall scripts by default. Yarn and pnpm allow you to do this (and pnpm at least won’t run them by default) and I believe npm now does too, and is looking at a future where it won’t run them by default.

The other difference is Go had several chances to do better, and they didn’t take any steps to do so.

The maintainers of NPM (the registry and tool) I'm sure would love to make a lot of changes to the ecosystem, but they can't make some of them without breaking too much, and at the scale that npm operates it's always going to be playing catch-up with workarounds and such for previous choices, so they don't, say, break hundreds of thousands of CI runs simultaneously.

Go iterated on its package ecosystem several times and ultimately did very little with it. They didn’t make it vastly more secure by default in any way, they were actually going to get rid of vendoring at one point, and a whole host of other SNAFUs.

Go's packaging and distribution model, while simple, is extremely primitive, and they have yet to really adopt anything in this area that would be beneficial for security.

by no_wizard

3/26/2025 at 7:32:25 PM

Just package node_modules subdirectories as tar files.

I stopped using npm a while back and push and pull tar files instead.

Naturally I get js modules from npm in the first place, but I never run code with it after initial install and testing of a library for my own use.

by teknopaul

3/26/2025 at 9:48:13 PM

This is a valid choice, but you must accept some serious trade-offs. For one thing, anyone wanting to trust you must now scrutinize all of your dependencies for modification. Anyone wanting to contribute must learn whatever ad hoc method you used to fetch and package deps, and never be sure of fully reproducing your build.

The de facto compromise is to use package.json for deps, but your distributable blob is a docker image, which serializes a concrete node_modules. Something similar (and perhaps more elegant) is Java's "fat jar" approach where all dependencies are put into a single jar file (and a jar file is just a renamed zip so it's much like a tarball).

by simpaticoder

3/27/2025 at 2:14:54 AM

It may not be a well-known feature, but npm can unpack tarballs as part of the install process, as that's how packages are served from the CDN.

If you vendor and tar your dependencies correctly, you could functionally build a system around trust layers, for instance by inspecting hashes before allowing unpacking.

It's a thought exercise, certainly, but there might be legs to this idea.
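
A rough sketch of that flow using stock npm commands (the package and hash handling here are illustrative):

  npm pack lodash@4.17.21                             # fetch the exact tarball the registry serves
  shasum -a 512 lodash-4.17.21.tgz                    # compare against a pinned, trusted hash
  npm install ./lodash-4.17.21.tgz --ignore-scripts   # only unpack once it checks out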

by no_wizard

3/27/2025 at 2:30:33 AM

I think Yarn's zero-install is now the default, and it does the same thing you're advocating? I'm not really a JS person, but it looks like it's done reasonably competently (validating checksums, etc.).

by CBLT

3/27/2025 at 3:50:07 AM

Same as Python (setup.py). It's even worse in Go, as they encourage you to just link GitHub repos at the currently latest version.

Only Java, .NET and R just download files, at a declared (reproducible) version.

by deepsun

3/26/2025 at 9:58:47 PM

You're already including arbitrary code in your application. Presumably you're intending to run that application at some point.

by delusional

3/26/2025 at 10:13:55 PM

What is the answer to that? Learn x86 and bootstrap?

by nextts

3/27/2025 at 12:39:24 AM

Capability-based security within Node. The main module gets limited access to the system (restricted by the command-line, with secure defaults), all dependencies have to be explicitly provided with capabilities they need (e.g. instead of a module being able to import "fs", it receives only an open directory handle to the directory that the library consumer dictates). Deno already does the first half of this.
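
A minimal sketch of the idea in today's Node (nothing enforces this without runtime support, and the dependency here is hypothetical):

  // main.js: the only module that touches "fs" directly
  const fs = require("node:fs/promises");
  const path = require("node:path");

  // Wrap one directory in a capability object with a deliberately narrow API.
  function dirCapability(root) {
    const base = path.resolve(root);
    return {
      readFile(rel) {
        const p = path.resolve(base, rel);
        if (!p.startsWith(base + path.sep)) throw new Error("path escapes capability root");
        return fs.readFile(p, "utf8");
      },
    };
  }

  // The dependency never imports "fs"; it only ever sees this handle.
  const render = require("some-markdown-lib"); // hypothetical dependency
  render(dirCapability("./content"));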

by CGamesPlay

3/27/2025 at 4:08:12 AM

I kinda wish there were programming languages where, instead of saying `import module`, you said "I must be run in a context where I have access to a function with this prototype". Effectively functions instead of modules, alongside duck-typed OO if you use OO.

The problem is that as soon as it becomes remotely popular, every module is going to end up saying "I must be run in a context where I have access to all the functions of version 13.2 of the filesystem module, wrapped up in a structure that claims to be version 13.2 of the filesystem module and signed by the private key that corresponds to the filesystem module author's public key", even though it only needs a random-access file handle for use as a temporary file. That's because otherwise developers will be anxious about leaked implementation details preventing them from making version 1.4.16 (they'll just have to make version 2.0; who cares? their implementation detail is my security).

by squiggleblaz

3/27/2025 at 1:58:09 PM

As an alternative (and this is what capability-based design is all about), instead of replacing dependencies, give access to system calls only to the main function of the app. It has to pass those on to any dependencies it wants, and so forth. The system calls are a small, mostly unchanging set of primitives, and any dependency can wrap them up in whatever API suits it.

Example: in order to open a file for writing, you need a capability object corresponding to write access to the file's parent directory. Now you can be sure that a dependency doesn't write any files unless you actually pass it one of these capability objects.

by CGamesPlay

3/27/2025 at 11:09:21 AM

I think the WASM/WASI environment may be closest to this. But it's an interesting idea.

by pjc50

3/26/2025 at 10:44:32 PM

The lavamoat npm package does something similar. It's maintained by the security team at MetaMask (crypto wallet extension and app). It's used in the extension runtime and also wraps the build process.

by davidmurdoch

3/27/2025 at 12:26:31 AM

We built "safe npm", a CLI tool that transparently wraps the npm command and protects developers from malware, typosquats, install scripts, protestware, telemetry, and more.

You can set a custom security policy to block or warn on file system, network, shell, or environment variable access.

https://socket.dev/blog/introducing-safe-npm

by feross

3/26/2025 at 8:21:58 PM

npm should run in Docker containers by default, at least to restrict access to the project being built.

The result of the compile will run on the machine anyway, but once again, it should be in Docker.
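
Something along these lines (a sketch; node:22-slim is just an example image, and Docker is not a hard security boundary, as discussed below):

  # a malicious install script sees only the mounted project, not $HOME
  docker run --rm -it -v "$PWD":/app -w /app node:22-slim npm install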

by eastbound

3/26/2025 at 10:52:36 PM

It blows my mind that developers will install things like npm, random libraries and so on on their machine, sometimes their personal one with the keys to the kingdom so to speak. But then again, people are now installing MCP servers the same way and letting LLMs run the show. Incredible really.

by namaria

3/27/2025 at 12:57:50 PM

So, what do you do for Windows and macOS users, in corporate environments, who don’t have access to virtualization on local machines? This describes most of the places I’ve worked as a consultant.

Container technology is awesome, and it's a huge step forward for the industry, but there are places where it's not feasible to use, at least for now.

by bshacklett

3/26/2025 at 9:59:33 PM

Docker is not a security boundary.

by delusional

3/26/2025 at 10:21:02 PM

Sure it is. It isn't airtight but then what is?

Even KVM escapes have been demonstrated. KVM is not a security boundary ... except that in practice it is (a quite effective one at that).

Taken to the extreme you end up with something like "network connected physical machines aren't a security boundary" which is just silly.

by fc417fc802

3/27/2025 at 4:04:53 AM

> Taken to the extreme you end up with something like "network connected physical machines aren't a security boundary" which is just silly.

1. This is why some places with secret enough info keep things airgapped.

2. OTOH, from what I recall hearing the machines successfully targeted by Stuxnet were airgapped.

by tbrownaw

3/27/2025 at 8:09:45 AM

Yeah, you have to move it off-planet to achieve an actual security boundary.

In our threat model the upper bound on the useful lifetime of the system is limited by the light-distance time from the nearest adversary.

by terom

3/27/2025 at 9:25:20 AM

Ah yes, the "maximally aggressive grey goo" threat model.

by fc417fc802

3/27/2025 at 1:50:33 PM

No software is perfect, but there is a massive difference between these two boundaries. If there is an escape in KVM, it's newsworthy, unlike in Docker. I don't feel like pulling up CVEs, but anybody following the space should know this.

by akimbostrawman

3/27/2025 at 4:19:24 PM

There's an even bigger difference between using Docker and not using any sort of protection, it's always going to be a security vs convenience tradeoff. Telling people who want to improve their security posture (currently non-existent) that "Docker is not a security boundary" isn't very pragmatic.

What percentage of malware is programmed to exploit Docker CVEs vs. just scanning $HOME for something juicy? Swiss cheese model comes to mind.

by dns_snek

3/28/2025 at 7:21:33 AM

It is better the same way a rope is better than no seat belt at all. Recommending Docker as a sandbox gives a false sense of security.

by akimbostrawman

3/26/2025 at 7:20:54 PM

Use Rust

by 2OEH8eoCRo0

3/26/2025 at 7:25:19 PM

According to the comment below, it should be “Use Java”.

by user432678

3/26/2025 at 7:45:15 PM

My comment was made in jest

by 2OEH8eoCRo0

3/26/2025 at 8:13:19 PM

Definitely stop using jest

by SCdF

3/26/2025 at 8:42:39 PM

I bet someone has already entertained the idea of adding a cryptominer to Jest; nobody would notice a slight increase in test running times on CI. Maybe it could even start funding those open source maintainers enough to finally make ES6 modules non-experimental.

by user432678

3/26/2025 at 8:43:26 PM

Downloads used in infrastructure... VSCode Extensions, Github repos, PyPI, NPM, etc. all need to be scrutinized.

Open source at least has the option to audit; closed source (or "closed build" stuff like 7zip) is at far higher risk: mostly you just have VirusTotal, which will mostly not catch backdoors of this type.

Mainland China, Russia, and North Korea use these vectors for corporate and government espionage: https://www.youtube.com/watch?v=y27B-sKIUHA ...XZ and Swoole are two examples off the top of my head.

by geenat

3/26/2025 at 7:16:16 PM

Malware in a crypto-related JavaScript package. Surprised Pikachu face

by phito

3/26/2025 at 8:30:53 PM

I'd like to see a world where the JS community focused more on improving the stdlib across the browser and in nodejs - much like bun is doing. Common packages for node such as mysql2, axios etc. are so widely used and are huge attack vectors should they ever be compromised.

by deanc

3/26/2025 at 9:57:33 PM

Deno is perhaps a better example, with browser APIs, membership in the WinterTC committee, and a growing set of std packages[1].

Possibly more importantly, it has a security model that defends against this kind of exploit.

I will agree with the sentiment though. I get not wanting to jump on new shiny things, but for some reason I keep getting the vibe that the community is closer to crab mentality than healthy skepticism: downright hostile towards any project making a genuine effort to improve things.

[1] https://jsr.io/@std

by DimmieMan

3/27/2025 at 4:09:30 PM

My issue with Deno and JSR is that they only exist at the discretion of VC funding.

That does not feel like a sustainable ecosystem; the incentives are misaligned (wanting a return on capital versus proper open engineering).

by azemetre

3/26/2025 at 7:53:56 PM

Put `ignore-scripts=true` in your .npmrc
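
That is, roughly (opting back in per package via `npm rebuild` should work, though verify against your npm version):

  # .npmrc
  ignore-scripts=true

  # later, run build scripts for one trusted dependency only
  npm rebuild esbuild --ignore-scripts=false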

by theteapot

3/26/2025 at 8:16:07 PM

That just delays the exploit. The exploit can still run the next time you import the package.

by gruez

3/26/2025 at 8:27:04 PM

This mostly defends against name squatting and other malicious dependencies that never get imported.

I haven't reviewed the code, but the article says its entry point is the install script. Install scripts don't run on import. I guess you're saying it triggers from import too.

by theteapot

3/26/2025 at 11:29:31 PM

I am reading this while downloading half the internet doing a `cargo build` for my hello world program.

At least it's not the cursed javascript, right...

by BrouteMinou

3/26/2025 at 7:32:31 PM

Could we start a community review pool?

by megadata

3/26/2025 at 7:40:46 PM

Why does NPM always seem to have these kinds of issues? Why do we rarely hear about similar problems with Maven Central, for example?

by dingi

3/26/2025 at 7:55:26 PM

I think a lot of it comes down to attack surface. JavaScript famously has a very limited standard library, so it is very common to pull in massive dependency chains of modules containing trivial functionality.

Contrast this to Java, C#, and Python, where you can write robust applications with just the standard libraries.

by dlachausse

3/26/2025 at 8:20:56 PM

Yep. See left-pad from 2016. Something so trivial it was basically part of every other major language's standard library (and it is now in JS, but wasn't at the time).
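
For the record, the built-in that replaced it (standardized as `String.prototype.padStart` in ES2017):

  "5".padStart(3, "0");  // "005"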

Other gems exist in the NPM world like is-odd, isarray, and is-negative-zero.

The whole ecosystem developed this absurd culture of micro-packages that all do something insanely trivial that's built-in to pretty much every other language. All of it is a result of trying to force the web into doing things it was never really designed for or meant to do.

Browsers were never supposed to be full application runtimes, and yet we (as an industry) keep doubling down on it

by thewebguyd

3/26/2025 at 9:02:05 PM

>Browsers were never supposed to be full application runtimes, and yet we (as an industry) keep doubling down on it

Web technologies are multi-platform and widely accessible, and they have an unmatched UI API, great tooling, and a large ecosystem. It's the path of least resistance.

Contrast this with something like Qt, which is an absolute pain to compile and is riddled with bugs. Its GUI library is mature, but nowhere near the web.

Time is money, and developers are many times more expensive than the extra cpu cycles the web consumes.

by yupyupyups

3/26/2025 at 10:23:28 PM

> Contrast this with something like Qt, which is an absolute pain to compile and is riddled with bugs.

Compiling Qt is a choice; there are many, many ways to get precompiled libs. Also, it is only hard to compile if you're not familiar with ./configure; it is actually absurdly easy to compile and link against Qt.

As for the bugs thing, sure, qt has bugs. So does _all other software_ and this is a very strange argument to make on your end.

by dgfitz

3/26/2025 at 10:43:02 PM

I would say that for serious development you should know how to compile the thing, because some bugs cannot be worked around without actually patching the library.

Qt has many dependencies, and it's not always clear which libraries are needed for any particular functionality.

Try compiling the documentation and setting up the code examples properly in Qt Creator; it's not that easy, and the instructions on how to do that are incomplete.

>As for the bugs thing, sure, qt has bugs. So does _all other software_ and this is a very strange argument to make on your end.

Most mature software does not expose its bugs early to its users. Qt occasionally has some really silly bugs that pop up across versions and simply shouldn't be there if you have proper QA. When they kept LTS updates away from non-paying users, this, I would argue, risked becoming a bigger issue.

I don't want to talk badly about Qt; it's actually a really awesome library. But it has quirks that are time-consuming to work around, which can hinder wider adoption. That was all I was trying to say.

Note: I would actually recommend it if someone is interested in building native applications with good performance and is comfortable with C++. Alternatively, I would suggest a Python wrapper such as PySide to mitigate many of the issues I was talking about.

by yupyupyups

3/26/2025 at 8:56:36 PM

The culture of micro-packages is mostly pushed by people with financial interests.

The modus operandi is to add a semi-useful package to a popular project, and then divide said project into other sub-projects just to bump the numbers.

Then those individuals start claiming that they have "25 packages that are foundational to the web infrastructure", while in fact it's just a spinner that has 24 dependencies, some of those being one-liner packages.

Parallel to that we also have things like Babel, which has hundreds of packages, a huge chunk of them being almost empty and only triggering flags in the core package.

by whstl

3/27/2025 at 11:07:32 AM

> Browsers were never supposed to be full application runtimes

Unfortunately all the proprietary OS vendors want to prevent the development of fully portable runtimes, so we've ended up where the browser is the only fully portable application runtime.

by pjc50

3/26/2025 at 9:27:03 PM

To be fair, Java has a pretty large package ecosystem as well, and projects often pull in a significant number of dependencies. Though the culture is very different, and you generally tend to avoid dependencies if possible. The idea of pulling in a package for testing whether a number is even or odd is incredibly foreign.

by marginalia_nu

3/27/2025 at 12:41:35 AM

Great question. A few reasons:

– The JavaScript ecosystem moves faster — way more packages, more frequent updates, and more transitive dependencies (avg 79 per package).

– npm has lower barriers to publishing, so it’s easier for malicious actors to get in.

– Java developers often use internal mirrors and have stricter review processes, while npm devs tend to install straight from the registry.

– But to be clear, supply chain attacks do happen in other ecosystems — they’re just underreported. We’ve seen similar issues in PyPI, RubyGems, and even Maven.

JavaScript just happens to be the canary in the coal mine.

by feross

3/26/2025 at 11:04:37 PM

Perhaps because nowhere else has one package for every 2,972 humans on planet Earth?

by robinsonb5

3/26/2025 at 7:45:31 PM

The biggest target has the biggest number of issues?

by carlmr

3/27/2025 at 3:06:00 AM

The Javascript ecosystem is the problem and is completely immature by design.

by rvz

3/26/2025 at 8:29:10 PM

Not sure about Java land, but we have seen this on PyPI and I think even on RubyGems.

by johnny22

3/26/2025 at 7:27:08 PM

I think the industry is soon going to look back on building with Wild West open-source repos the way we now look back on the pre-Snowden era, when not absolutely everything ran on HTTPS. I know Google has "assured" open source repos for Python and Java [1]. Are there other similar providers for those and other languages?

[1] https://cloud.google.com/assured-open-source-software/docs/o...

by tedd4u

3/26/2025 at 8:05:29 PM

Any reasonable company already knows this and sets up a proxy repo of scanned/approved versions (this is important for licensing too).

by giantg2

3/26/2025 at 9:45:01 PM

You're absolutely right, but you've just asserted that almost all companies making software are unreasonable.

Distressingly, doing what you suggest remains the exception by orders of magnitude. Very few people have internalized why it's necessary and few of those have the political influence in their organizations to make it happen.

by swatcoder

3/26/2025 at 9:01:29 PM

Not from what I’ve seen. What are the relevant products in this space? Can’t expect every random company to set up package scanning from scratch.

by pletnes

3/26/2025 at 9:35:28 PM

JFrog / Artifactory is one very common provider of private npm registries. There are a ton of security-scan vendors out there (mend/whitesource, socket, black duck...)

by chrisweekly

3/26/2025 at 9:31:48 PM

I worked for an IBM acquiree 13 years ago and as part of the "Blue-washing" process to get our software up to IBM spec we had to use their proprietary tools for verifying our dependencies were okay.

by tsm

3/27/2025 at 12:37:16 AM

Well then, I wouldn't expect to do business with every random company. Third-party risk management (TPRM) is a big issue today, so I wouldn't expect any company not performing basic due diligence to win business.

by giantg2

3/26/2025 at 8:25:12 PM

How much is that automated scanning worth? Sure, we have mirrored repos, but I assume the malware authors pre-test their code on a suite of detectors in CI. So infected packages will happily be mirrored internally for consumption.

by 0cf8612b2e1e

3/27/2025 at 12:37:23 AM

Totally agree. Most companies using mirrors or proxies like Artifactory aren’t getting much real protection.

- They cache packages but don’t analyze what’s inside.

- They scan or review the first version, then auto-approve every update after that.

- They skip transitive deps — and in npm, that’s 79 on average per package.

- They rely on scanners that claim to detect supply chain attacks but just check for known CVEs. The CVE system doesn’t track malware or supply chain attacks (except rarely), so it misses 99%+ of real threats.

Almost everything on the market today gives a false sense of security.

One exception is Socket — we analyze the actual package behavior to detect risks in real time, even in transitive deps. https://socket.dev (Disclosure: I’m the founder.)

by feross

3/26/2025 at 9:56:07 PM

Not much. As you say, static scanning is pretty much a dead-end strategy. Exploiters have long since realized that you can just run the scan yourself and jiggle the bytes around to evade signature detection.

by delusional

3/26/2025 at 8:38:40 PM

At my company, I think someone at least has to approve/verify the scan results. Of course it's still a risk, but so are external emails, vendor files, and everything else.

by giantg2

3/26/2025 at 8:38:32 PM

It is worth a fair bit. If you control the mirroring you can ensure the malware is flagged but not deleted, so forensics can assess how much damage has been done or would have been done, for instance.

by thibaut_barrere

3/26/2025 at 8:30:46 PM

>I assume the malware authors pre test their code on a suite of detectors in CI.

Maybe some do, but you give the average malware developer way too much credit.

by poincaredisk

3/26/2025 at 9:16:46 PM

Bugger all. We had something go straight through.

by ohgr

3/26/2025 at 7:50:43 PM

> npm is a package manager for the JavaScript programming language maintained by npm, Inc., a subsidiary of GitHub. -- [1]

and Microsoft owns GitHub, so Microsoft is the provider? Pretty sure they're running malware scanners over NPM constantly, at the least. NPM also has (optional) provenance [2] tied to a GitHub build workflow, which is as strong as being "assured" by Google, IMO. The only problem is that it's optional.
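
For what it's worth, the opt-in flow is roughly (a sketch of the two sides):

  npm publish --provenance   # publisher: attach an attestation linking the package to its build
  npm audit signatures       # consumer: verify registry signatures and provenance attestations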

[1]: https://en.wikipedia.org/wiki/Npm [2]: https://github.blog/security/supply-chain-security/introduci...

by theteapot

3/26/2025 at 9:34:37 PM

This is a coordination failure. We have ways to distribute the source, but not the reviews. Every time someone does any level of review, that should be publishable too.

by the8472

3/26/2025 at 9:08:27 PM

> like we looked back on not having absolutely everything running on HTTPS in the Snowden era.

Apples and oranges and this is far, far worse.

You can absolutely ship signed, trusted code over standard HTTP. Microsoft did this for years and Debian and OpenBSD to name a few still do.

HTTPS does not assure provenance of code.

Anyone who doesn't understand this is very misinformed about what HTTPS does and doesn't do.

by donnachangstein

3/27/2025 at 5:47:55 AM

Sorry, I wasn't clear. I meant it only in the general sense that, in the not-too-distant past, the industry was content with a huge hole like running only the login page under HTTPS and none of the rest of the site's traffic, which in hindsight seems insane. What I mean is the situation (explored in the rest of this thread) where many in the industry seem content to consume code extensively from public repos without many obstacles to prevent a supply-chain attack. What I'm saying is that soon the industry will probably look back on this in the same way: "What were we thinking!?"

by tedd4u

3/26/2025 at 10:17:04 PM

I think this depends on one's definition of "code"

by genewitch

3/26/2025 at 10:15:34 PM

> Wild West open-source repos

There's a deeper issue though. I frequently have difficulty getting things to build from source in a network-isolated environment. That's after I manually wrangle all the dependencies (and sub-deps, and sub-sub-deps, and ...).

Even worse is something like emscripten where you are fully expected to run `npm install`.

Any build process that depends on network access is fundamentally broken as far as I'm concerned.

by fc417fc802

3/26/2025 at 10:19:44 PM

Which is nearly all of them that I can think of, except perhaps C/C++, in terms of broadly adopted languages.

You can cache and/or emulate the network to go offline, but fundamentally a fresh build in most languages will want to hit the network, at least by default.

by no_wizard

3/26/2025 at 10:59:04 PM

In my world (VHDL/Verilog and some C/C++) there's a difference between the "fetch" and "build" steps. It's perfectly reasonable for the fetch step to require network access; the build step should not.

The real problem is that some language ecosystems conflate those two steps.
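
In npm terms the same split looks like this (a sketch; the second step assumes the cache was populated by the first):

  npm ci             # "fetch": network allowed, exact versions from the lockfile
  npm ci --offline   # "build": fails instead of silently hitting the network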

by robinsonb5

3/26/2025 at 11:03:10 PM

I'm mostly on board with that dichotomy except that I think it's also important that all fetched artifacts either come from a VCS or are similarly cryptographically versioned and all historical versions made available in a reliable manner.

by fc417fc802

3/26/2025 at 11:10:55 PM

Yes, absolutely - I can't disagree with that.

by robinsonb5

3/26/2025 at 10:59:24 PM

> at least by default

Even in C/C++, after changing the relevant parameters to non-default values, things often break. It seems those configurations often go untested.

Google-managed repos are a nice exception to this: clearly documented commit hashes for all dependencies for a given release.

by fc417fc802

3/26/2025 at 8:17:04 PM

So then instead of knowing nothing, we'll know that Google wants us to use it, which . . . is a different problem. :-)

by BrenBarn

3/26/2025 at 9:56:14 PM

How does HTTPS help with the problems Snowden uncovered? You don't run on HTTPS; HTTPS just does in-transit encryption between two points of the service architecture. That is why you can (could?) slap Cloudflare atop your HTTP-only site and get a padlock!

by nextts

3/26/2025 at 10:11:57 PM

Because one of the methods reported was scanning HTTP packets, easily read without SSL from any hop in the chain. More importantly, he blew the lid off the fact that governments had access to this via the very ISPs everyone relies on for telecom. By making everything TLS, they can look all they want but they can't read it.

You could do TLS offloading at your load balancer, but then you have to secure your entire network, starting with your ISP. For some workloads this is fine; you aren't dealing with super sensitive data. For others, you are violating compliance.

by reactordev

3/27/2025 at 5:57:47 AM

I'm referring to programs like MUSCULAR [1] and PRISM [2] where NSA was tapping inter- and intra-datacenter traffic of major internet communications platforms like Gmail, Yahoo Mail, Facebook etc. At the time, that kind of traffic was not encrypted. It was added in a hurry after these revelations.

[1] https://en.wikipedia.org/wiki/MUSCULAR

[2] https://en.wikipedia.org/wiki/PRISM

by tedd4u

3/27/2025 at 9:17:09 PM

Oh yeah, where I work we run encryption between EC2s. But I don't think it is HTTPS. Probably something lower level (TODO: read up on how it works!)

by nextts

3/28/2025 at 10:59:59 PM

I saw it done with encrypted MPLS at a very large tech company.

by tedd4u

3/27/2025 at 12:30:12 AM

Totally agree — we’re going to look back and wonder how we ever shipped code without knowing what was in our dependencies. Socket is working on exactly this: we analyze the actual code of open source packages to detect supply chain risks, not just known CVEs. We support npm, PyPI, Maven, .NET, Rubygems, and Go. Would love to hear which ecosystems you care about most.

(Disclosure: I’m the founder. https://socket.dev)

by feross

3/26/2025 at 8:24:32 PM

If you include commercial offerings, Red Hat has offered this for a while, and many semi-successful startups have tried creating a business model solving this.

by Kaytaro

3/26/2025 at 9:16:11 PM

Based on the staff I see at the average technology company I wouldn’t expect this to get any better any time soon. The state of things is definitely declining.

by ohgr

3/26/2025 at 10:18:01 PM

People wonder why I run their shitty apps in VMs and nuke the VM afterwards.

This is why, lol.

by xyst

3/27/2025 at 10:13:39 PM

Sounds like Qubes OS, which is my daily driver.

by fsflover

3/26/2025 at 9:33:56 PM

What's the advice? Only develop in a sandbox environment? Otherwise chances are our main machines get compromised?

by submeta

3/27/2025 at 10:04:30 AM

Vet your dependencies, at least to the level of making sure your direct ones are actually real projects done by known people and reasonably widely used. Note that for all the buzz about these kinds of attacks, there's relatively little evidence they are actually successful at being downloaded and installed by anything but automated scanning/archiving systems.

by rcxdude

3/26/2025 at 9:30:33 PM

The open-source ecosystem's strength is also its weakness. Relying solely on community vigilance isn't cutting it anymore.

by distalx

3/27/2025 at 12:39:45 AM

Exactly. Linus’s Law — “given enough eyeballs, all bugs are shallow” — falls apart when everyone assumes someone else is doing the watching. In reality, most packages (and especially their transitive dependencies) get zero meaningful review. Attackers know this and exploit it. Community vigilance just doesn’t scale — we need better tools to actually inspect what code is doing.

by feross

3/27/2025 at 5:43:51 AM

I agree with you. To be fair though, the concept likely seemed more reasonable in 1999. Hardware, browsers, and websites (and their front- and back-end services) were all less complex back then. Also less bloat. Not that things were more secure, but a popular tool may have had more meaningful review.

At times, complexity is worth the trade-offs. Modern C++ compilers are more complex than ones in the 80s and 90s, but the assembly code they generate runs much faster. Rust is complex but provides massive security benefits while maintaining great performance.

At times though, stuff is just bloated or poorly designed.

But it's not always clear how to intelligently design a project. If you add too many features to a single large project, it becomes unwieldy to maintain that large project, and the harder it is to audit this critical piece of infrastructure. Yet, if you don't add enough features, people will use packages from random devs, risking their own security, while harming the maintainability of their own project.

I don't know how we solve that problem. Alternatively, you could ask devs to reinvent the wheel and write a lot more code on their own (which they probably won't, either because they don't want to, or because the employer requires a solution on too short of a timeline to do so), but that could also jeopardize security. Many if not most web devs have to deal with authentication and encryption, both of which (the overwhelming majority) very much should not do on their own. Good luck asking a junior dev to correctly implement AES-256 encryption (or something equivalent or better) on their own without using existing libraries.

The answer is almost certainly some kind of mix, but it's not clear what exactly that should look like.

by johnfernow

3/27/2025 at 8:58:47 PM

Downloading NodeJS modules and containers from "public" data sources that are either vetted or are vetted by unknown parties, isStupid == TRUE. Play stupid games, win stupid prizes. It's unfortunately not a problem specific to Node, goes back to at least the early Linux kernel and CPAN days. As GenZ has invaded Wall Street, they have started to pay attention to this concern and call it a "supply chain integrity" issue. Ironically, GenZ also leads the charge in most companies to "trust everything" because "everyone is doing it." (In most corporations, "perception is the reality" and if everyone perceives jumping off the cliff is The Way https://youtu.be/V-SJQdREDKM)

A solution is for ID.me to begin issuing developer certificates (for free to those offering themselves up as developers) and for the old system of public repositories and mirrors to be replaced by a single source of truth operated by a new entity jointly sponsored by the $15.4 trillion worth of companies: Amazon, Google, Meta, Apple, Nvidia, Tesla and Microsoft, in such a way that everything in every repository can be traced back to the actual biometric signature of a developer and their related IRL context. This entity, its sponsors, and ID.me also need to observe a regulation that they will NOT "track" or share information between each other (or others) about developers indexed off their developer certificates beyond what's necessary to host the repositories and to manage access to developer-only programs (such as Xcode signing, etc.). Ideally the JV will be located in Switzerland (perhaps with the hardware located in a Svalbard-like facility) using some UN-supervised process (overseen by NATO, Russia, China and the Plebeians) to vet all workers.

by vaxman

3/28/2025 at 6:04:12 PM

*-either NOT vetted or were vetted by unknown/untrusted parties

by vaxman

3/26/2025 at 7:03:07 PM

Back in the day repositories had 'maintainers' who reviewed packages before they became included. I guess no one really cares in the web dev world; it's a free-for-all.

by JTbane

3/26/2025 at 7:13:39 PM

It's not just web dev: Go, Rust, Swift, Ruby, Python... none of them do any checking.

In fact the only repo I know of doing any checking is Java's Maven Central/Sonatype, and it's automated, not manual.

by CamJN

3/26/2025 at 7:44:24 PM

I'm curious how much review happens in Nix packages. It seems like individual packages have maintainers (who are typically not the software authors). I wonder how much latitude they have to add their own patches, change the source repo's URL, or other sneaky things.

by bqmjjx0kac

3/26/2025 at 9:44:27 PM

Not a lot in most cases. You're still just grabbing a package and blindly building whatever source code you get from the web. Unless the maintainer is doing their due diligence, there's nothing.

The same goes for almost all packages in all distros though.

I'd say most of us have some connection to what we're packaging, but there are plenty of hastily approved and merged "bump to version x" commits happening.

by bamboozled

3/26/2025 at 9:50:57 PM

Nixpkgs package maintainers don't usually have commit rights. I assume that if one tried to include some weird patch, the reviewer would at least glance at it before committing.

by jowea

3/27/2025 at 1:11:14 PM

I've never looked at the process of making a nixpkg, but wouldn't the review process only catch something malicious if it was added to the packaging process? Anything malicious added to the build process wouldn't show up, correct? At least not unless the package maintainer was familiar with it and looked themselves?

by c0wb0yc0d3r

3/29/2025 at 12:36:08 PM

I am not sure I understand the distinction between the packaging and build process, at least in the context of nixpkgs. Packages in nixpkgs are essentially build instructions, which you can either build/compile locally (like Gentoo) but normally you download them from the cache.

Official packages for the nixpkgs cache are built/compiled on Nix's own infrastructure, not by the maintainers, so you can't just sneak malicious code in that way without cracking into the server.

What package maintainers do is contribute these build instructions, called derivations. Here's an example for a moderately complex one:

https://github.com/NixOS/nixpkgs/blob/master/pkgs/applicatio...

https://github.com/NixOS/nixpkgs/blob/master/pkgs/applicatio...

As you can see, you can include a patch to the source files, add custom bash commands to be executed and you can point the source code download link to anywhere you want. You could do something malicious in any of these steps, but I expect the reviewer to at least look at it and build it locally for testing before committing, in addition to any other interested party.

by jowea

3/26/2025 at 7:22:57 PM

OCaml's opam does have a review process, although I'm not sure how exhaustive. It's got a proper maintenance team checking for package compatibility, updating manifests and removing problematic versions.

I don't think this would be viable if the OCaml community grew larger though.

by debugnik

3/26/2025 at 8:19:12 PM

Some alternative sources for other languages do it. Conda-forge has a process that involves some amount of human vetting. It's true that it doesn't provide much protection against some kinds of attacks, but it makes it harder to just drop something in and suddenly have a bunch of people using it without anyone ever looking at it.

by BrenBarn

3/26/2025 at 7:30:39 PM

And people bash C/C++ for not having some kind of central package management system. Hah!

by SunlitCat

3/26/2025 at 7:44:06 PM

IMO C/C++ is not much better, sure, no central package management system, but then people rewrite everything because it's too hard to use a dependency. Now if you do want to use one of the 1000 rewrites of a library, you'll have a lot more checking to do, and integration is still painful.

Painless package management is a good thing. Central package repositories without any checking isn't. You don't have to throw away the good because of the bad.

by carlmr

3/26/2025 at 8:04:05 PM

I have that in C++: we wrote our own in-house package manager. It's painless for any package that has passed our review, but since it is our manager, we have enforced rules that you need to pass before you can get a new package in, thus ensuring it is hard to use something that hasn't been through review.

I'm looking at Rust, and the fact that it doesn't work well with our package manager (and our rules for review) is one of the big negatives!

Note: if you want to do the above, just use Conan. We wrote our package manager before Conan existed, and it isn't worth replacing now, but maintaining our own isn't really worth it either. What is important is that you can enforce your review rules in the package manager, not what the package manager is.

by bluGill

3/26/2025 at 7:57:19 PM

> Painless package management is a good thing. Central package repositories without any checking isn't.

There's a reason why these things come hand in hand, though. If the package management is so painless that everyone is creating packages, then who is going to pay for the thoroughly checked central repository? And if you can't fund a central repository, how do you get package management to be painless?

The balance that most language ecosystems seem to land on is painless package management by way of free-for-all.

by lolinder

3/27/2025 at 6:47:43 AM

>And if you can't fund a central repository, how do you get package management to be painless?

You could host your own package server with your own packages, and have the painless package manager retrieve these painlessly.

Of course, we're in this situation because people want that painlessness for what other people built. But "other people" includes malicious actors every once in a while.

by carlmr

3/26/2025 at 8:27:16 PM

Correct me if I'm wrong, but the usual advice in the C/C++ world is to just grab the source code of any libraries you want and build them yourself (or use built-in OS libs). This is not great if you have a lot of dependencies.

by JTbane

3/27/2025 at 9:55:01 AM

Yeah! Nothing like the xz backdoor could happen there! Wait...

by rcxdude

3/26/2025 at 8:07:20 PM

Web dev has always been a hot mess.

by giantg2

3/26/2025 at 7:59:24 PM

> Back in the day repositories had 'maintainers' who reviewed packages before they became included.

Then "walled garden" became a pejorative, and well… here we are.

by reaperducer

3/26/2025 at 8:03:48 PM

Pardon? Maybe my definition of that term is just different but that seems wholly unrelated.

by skeaker

3/26/2025 at 9:31:07 PM

I'm guessing we have to move to a setup where a developer enters code while being watched by an AI. Then the AI can give warnings if a line of code appears in the repository that wasn't on the developer's screen while they were looking.

by amelius

3/26/2025 at 9:40:39 PM

We will need an AI to watch the AI so that the first AI isn't working for the bad guy too

by delfinom

3/26/2025 at 9:42:33 PM

"I'm shocked, shocked... well not that shocked... "

It is the curse of all out-of-band package managers... where eventually some lamer shows up to ruin the fun for everybody. =3

by Joel_Mckay

3/26/2025 at 11:09:52 PM

This is the intersection of "user friendly" and "self installed." It exists anywhere this is tried.

The entire idea that 'postinstall' or 'preinstall' or any sort of _script_ needs to run after you've fetched a package of what should be /interpreted/ code is completely insane on any "modern" desktop OS.

From what I can tell the main use case is avoiding the problems from users who cannot figure out how to set an environment variable or reliably run a subshell.

by timewizard

3/26/2025 at 7:10:37 PM

I think they should start scanning packages with the help of AI.

by cute_boi

3/26/2025 at 7:24:49 PM

“Let’s 10x that shit”?

by user432678

3/26/2025 at 7:29:15 PM

Nah! Then we would need to add some blockchain and maybe sprinkle some other buzzwords here and there for good measure!

by SunlitCat

3/26/2025 at 8:11:44 PM

The AI scanner must use blockchain validation and of course, be written in Rust.

Actually, just rewrite all the packages on npm in rust and that will automatically get rid of any security problem.

by nottorp