alt.hn

4/1/2026 at 11:25:29 PM

Obfuscation is not security – AI can deobfuscate any minified JavaScript code

https://www.afterpack.dev/blog/claude-code-source-leak

by rvz

4/2/2026 at 12:19:23 AM

JS was never really obfuscated; hiding it was never the goal of minification. Minifiers especially struggle with ES6 classes and the like, outputting code that is almost human-readable.

Proper obfuscation libraries exist, typically at the cost of a notable amount of performance that I'd wager most are not willing to sacrifice.

And, like even the best client-side DRM, everything can be reverse engineered once all the code has been downloaded to the user's machine. It's one of the (IMO terrible) excuses for the SaaSification of all software.

by maxwg
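
The distinction above can be sketched concretely. A hypothetical example (the function and values are invented for illustration): the same code before and after typical minifier output, behaviorally identical and with its structure intact.

```javascript
// Original, as an author would write it:
function averageOrderValue(orders) {
  const total = orders.reduce((sum, order) => sum + order.price, 0);
  return orders.length === 0 ? 0 : total / orders.length;
}

// Typical minifier output: names shortened, whitespace stripped,
// logic and control flow completely unchanged.
function a(n){const t=n.reduce((e,r)=>e+r.price,0);return 0===n.length?0:t/n.length}

const orders = [{ price: 10 }, { price: 30 }];
console.log(averageOrderValue(orders)); // 20
console.log(a(orders));                 // 20, identical behavior
```

Because the structure survives, recovering readable code is mostly a renaming exercise, which is exactly the kind of pattern-matching a language model is good at. An obfuscator, by contrast, would also rewrite control flow and hide string and property names.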

4/2/2026 at 1:39:09 AM

Minification was originally about sending fewer bytes over the wire and saving a bit of performance. Somewhere along the road people started misusing it for security, because JS evolved from "a few snippets of code to make my site more interactive" to SPAs.

by integralid

4/2/2026 at 12:24:24 AM

> AfterPack approaches this differently. Instead of layering reversible transforms on top of each other, AfterPack uses non-linear, irreversible transforms — closer to how a hash function works than how a traditional obfuscator works. The output is functionally equivalent to the input, but the transformation destroys semantic meaning in a way that cannot be reversed — even by AfterPack itself. There's no inverse function. No secret key that unlocks the original.

That’s probably fun when trying to analyze bugs occurring in production. :)

by layer8

4/2/2026 at 12:37:04 AM

What they’re describing is a polymorphic virus. A great analogy for SV startups.

It works great in assembly, not so much for higher level languages.

by throwup238

4/2/2026 at 1:16:21 AM

Are all polymorphic programs viruses?

by brookst

4/2/2026 at 1:26:52 AM

Not necessarily, but in practice no one has any use for the technique except to obfuscate viruses, academic research aside.

The non-virus equivalent is JITs, which are present in all major browsers and tons of other runtimes, but they have no use for this kind of polymorphism; they use polymorphism extensively, but only at the type level.

by throwup238

4/2/2026 at 4:54:47 AM

I fought with polymorphic code quite a bit back when I was removing copy protection (many decades ago). There may be other cases where making debugging hard is desirable.

by brookst

4/2/2026 at 12:23:09 AM

Minification is not obfuscation and obfuscation is not security, but no amount of deobfuscation will recover the comments in the source, which are often more insightful than the code itself.

by Retr0id

4/2/2026 at 1:15:40 AM

Obfuscation is meant to slow someone down by making the code difficult to understand. Slowing an attacker down is often employed as a form of security; that is why castles had walls, moats, and multiple layers once you got inside to hinder progress.

It has often been used by companies, malware authors, etc. to make it difficult for someone else to understand what is happening internally.

by JollySharp0

4/2/2026 at 1:35:45 AM

Often like 1 in 100 js files?

by postalrat

4/2/2026 at 12:26:50 AM

If the comments were in the original source that the model trained on... Then sure, those are recoverable too.

by TurdF3rguson

4/2/2026 at 12:14:50 AM

It's a cat and mouse game; it provides the desired level of security for the people who use it. It isn't used to prevent people from finding vulnerabilities (not mostly, at least). It's used to deter competition, prevent clones of the application, etc. It's makeshift "DRM". There are ways to defeat even AI-assisted analysis running in a proper browser, but I think it's not a good idea to give anyone ideas on this subject. Proper DRM is hellish enough.

Was there ever obfuscated JS code a human couldn't reverse given enough time? It's like most people's doors: they won't stop someone with a battering ram, but they will ideally slow them down enough for you to hide or get your guns. In this case, it won't even slow them down, until it does (hence: cat and mouse game).

by notepad0x90

4/2/2026 at 1:43:12 AM

>Was there ever an obfuscated JS code a human couldn't reverse given enough time?

I reverse malware for a living, and no, there wasn't. With some experience, even the best obfuscation is actually pretty easy to defeat. But the goal of malware analysis is to extract some knowledge (what the code does, IPs, URLs, tokens). Getting a runnable, clean version would often be long, tedious work.

by integralid

4/2/2026 at 12:17:57 AM

Huh? Their justification for "obfuscation isn't security" is pointing out that the Claude source wasn't obfuscated, it was minified. And that it could be "deobfuscated by Claude itself" - even though, again, they said the code wasn't obfuscated.

So I guess, ask Claude to deobfuscate some code that's ACTUALLY OBFUSCATED if you want to claim obfuscation provides ZERO additional security.

>We analyzed this file at AfterPack as part of a deobfuscation case study. What we found: it's minified, not obfuscated.

>Here's the difference. Minification — what every bundler (esbuild, Webpack, Rollup) does by default — shortens variable names and removes whitespace. It makes code smaller for shipping. It was never designed to hide anything.

>Here's where it gets interesting. We didn't need source maps to extract Claude Code's internals. We asked Claude — Anthropic's own model — to analyze and deobfuscate the minified cli.js file.

by tw04

4/2/2026 at 1:55:25 AM

It seems pretty clearly AI-written.

by what

4/2/2026 at 12:12:50 AM

I successfully did this the other day. There was a web app I used quite a bit with an annoying performance issue (in some cases its graphics code would spin my CPU at 100% constantly, fans full-blast). I asked Claude to fetch the code and fed it a few performance traces I took through Firefox, and it cut through all those obfuscated variables like they weren't even there, easily re-interpreting what each function actually did, finding a plausible root cause and workaround (which worked).

Can you generally trust it to de-obfuscate reliably? No idea. My sample size is 1.

by ryandrake

4/2/2026 at 12:26:08 AM

I did something similar yesterday. I'm playing a little idle game, and wanted to optimise my playthrough. I pointed claude at the game's data files, and in a few short minutes it reverse engineered the game data and extracted it to CSV / JSON files for analysis.

In this case, it turned out the data - and source code for the game - was in a big minified javascript file. Claude extracted all the data I wanted in about 2 minutes.

by josephg

4/2/2026 at 12:15:24 AM

The _any_ part is not clear to me. Obfuscation is an arms race. Reverse engineers have always been tool-assisted. Now they just have new tools and the obfuscators need to catch up.

by 0x3f

4/2/2026 at 1:46:03 AM

Nicholson entered the mantrap and the double doors closed behind him. He emptied his pockets and disrobed before donning the clean suit that had been provided to him by the orderlies. The camera watching him appeared satisfied that he was properly prepared and, more to the point, that the vendor was properly protected. The doors to the inner chamber opened and he proceeded into the hallway. He passed several doors until he reached the one that was labeled with the name of the vendor. He pressed the button on the doorframe. A satisfying tactile click, a spinning light illuminating around the button, a click, and then the door opened soundlessly. A single desk with a small chair and a computer terminal awaited him. He sat down and the screen turned on automatically. Finally, he was able to set about classifying his expenses from a recent trip to Tokyo. It was inconvenient, but a small price to pay to ensure that the vendor’s unique interfaces, their intellectual property, couldn’t be copied by the replication machines. Their eyes and their ears were everywhere in the outside world. Simply by seeing your software, these machines could copy its essence. The risks of operating software in the wild required that proprietary software be protected. Hidden away from eavesdroppers. Such was the world in 2037.

by throwaway9980

4/2/2026 at 12:17:39 AM

And read through native code as well

by socalgal2

4/2/2026 at 2:04:05 AM

About 8 months ago I deliberately obfuscated some merge sort code to give to my software engineering students to make them upset. :) I munged it up pretty good, changing the structure, making the variable names completely misleading, destroying the symmetry, etc. Out of curiosity, I fed it into ChatGPT and it had it figured out in zero seconds.

by beej71

4/2/2026 at 12:28:53 AM

> No one talks about this. There's no VentureBeat headline about GitHub shipping email addresses in their JS bundles. No Hacker News thread about internal URLs exposed in Anthropic's CDN scripts

That's a huge sign none of that information is truly sensitive. What is being implied here?

> AI Makes This Urgent

No it doesn't. This is blogspam and media hype nobody is interested in. Unless the demographics have really shifted that much in the last few years, HN is one of the worst places to attempt this marketing style.

by sublinear

4/2/2026 at 1:42:06 AM

JavaScript code is the essence of minified security.

by mediumsmart

4/2/2026 at 12:13:09 AM

Write your blog yourself if people are supposed to read it, not this LLM slop.

by durzo22

4/2/2026 at 12:15:37 AM

Isn't it fair for an article about AI deobfuscating code to be written by AI?

by notepad0x90

4/2/2026 at 12:19:15 AM

If it’s too hard to read ask your ai to deobfuscate it :D

by socalgal2

4/2/2026 at 12:41:18 AM

Fair? No. Par for the course? Unfortunately yes.

by gertop

4/2/2026 at 12:19:18 AM

I expect it these days but it’s still disrespectful slop pushing out real work.

by Gigachad

4/2/2026 at 12:21:56 AM

Not really, no.

by Retr0id

4/2/2026 at 12:34:42 AM

Slight historical note: it might be interesting to see how the brief period of "white-box cryptography" stands up to AI today. At the time there were a few companies with products that had trouble finding fit (for straightforward security reasons), but they were essentially commercial obfuscators that made heavy use of lookup tables, miniature virtual machines, and esolang concepts that worked mainly against human reverse engineers.

An example was this early AES proposal: https://link.springer.com/chapter/10.1007/3-540-36492-7_17

by motohagiography

4/2/2026 at 12:46:04 AM

White-box cryptography is widely deployed in browser plugins for DRM.

by Retr0id
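
A toy sketch of the lookup-table idea mentioned above (an invented example, nothing like real white-box AES): the key-dependent operation is precomputed into a table so the key constant never appears in the shipped code.

```javascript
// Build phase, run offline by the obfuscator (key byte 0x5a, invented):
const KEY = 0x5a;
const TABLE = new Uint8Array(256);
for (let b = 0; b < 256; b++) TABLE[b] = b ^ KEY;

// Shipped phase: the key-dependent operation is a pure table lookup;
// only TABLE ships, the KEY constant does not.
function encryptByte(b) {
  return TABLE[b & 0xff];
}

console.log(encryptByte(0x00).toString(16)); // "5a"
console.log(encryptByte(encryptByte(0x37))); // 55 (0x37): XOR is its own inverse
```

For a single XOR table this is trivially broken (`TABLE[0]` is the key), which illustrates the "straightforward security reasons" in the comment above; real white-box designs chain many mutually encoded tables to resist exactly that extraction.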
