4/1/2025 at 4:38:13 PM
Seems to be a change in Cloudflare's managed WAF ruleset - any site using that will have URLs containing 'camel' blocked due to the 'Apache Camel - Remote Code Execution - CVE:CVE-2025-29891' (a9ec9cf625ff42769298671d1bbcd247) rule. That rule can be overridden if you're having this issue on your own site.
by tom_usher
4/1/2025 at 7:52:19 PM
> any site using that will have URLs containing 'camel' blocked

What engineer at Cloudflare thought this was a good resolution?
by internetter
4/1/2025 at 8:03:35 PM
I doubt the system is that simple. No one wrote a rule saying `if url.contains("camel") then block()`; it's probably an unintended side-effect.
by Raed667
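(Editor's note: a minimal, entirely hypothetical sketch of why a naive substring rule over-blocks - this is not Cloudflare's actual rule logic, and the pattern and URLs below are illustrative assumptions.)

```python
import re

# Hypothetical over-broad rule: block any URL mentioning "camel",
# intended to catch Apache Camel exploit probes (CVE-2025-29891).
BROAD_RULE = re.compile(r"camel", re.IGNORECASE)

def waf_blocks(url: str) -> bool:
    """Return True if the (hypothetical) rule would block this URL."""
    return bool(BROAD_RULE.search(url))

# An exploit-looking URL is caught...
assert waf_blocks("/api?class=org.apache.camel.Exchange")
# ...but so are completely unrelated pages (false positives):
assert waf_blocks("/blog/camelcase-naming-conventions")
assert waf_blocks("/shop/camel-hair-coats")
```

Whether the real rule is a bare substring match or a broader regex with unintended reach, the failure mode is the same: the pattern fires on benign URLs that merely contain the string.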
4/1/2025 at 9:19:47 PM
If this is a bet, I'll happily take the other side and give you 4:1 on it.
by keithwhor
4/1/2025 at 9:28:25 PM
Me too.
by dgfitz
4/1/2025 at 10:13:22 PM
Akamai has been doing precisely that for years & years...
by ycombinatrix
4/2/2025 at 12:36:30 AM
I think you can include advertising/privacy block lists in that vein too, although those at least allow users to locally correct any issues.
by benoau
4/2/2025 at 4:34:14 PM
Judging by previous outages, it was probably a poorly tested, overcomplicated regex which matched too much.
by isbvhodnvemrwvn
4/2/2025 at 3:21:10 AM
[dead]
by TacticalCoder
4/1/2025 at 5:05:20 PM
Confirmed here: https://www.cloudflarestatus.com/incidents/gshczn1wxh74
by cbovis
4/1/2025 at 6:38:23 PM
WAFs are so shit
by oncallthrow
4/1/2025 at 7:57:42 PM
WAFs are literally "a pile of regexes can secure my insecure software"
by ronsor
4/1/2025 at 9:07:57 PM
To be fair to WAFs, most are more than just a pile of regexes. Things like detecting bot traffic - be it spammers or AI scrapers - are valuable (ESPECIALLY the AI scraper detection, because unlike search engines these things have zero context recognition or respect for robots.txt and will just happily go on and ingest very heavy endpoints), and the large CDN/WAF providers can do it even better because they can spot shit like automated port scanners, Metasploit or similar skiddie tooling across all the services that use them.

Honestly, what I'd _love_ to see is AWS, GCE, Azure, Fastly, Cloudflare and Akamai banding together to share information about such bad actors, compile evidence, and file abuse reports with their ISPs - or, in case the ISP is a "bulletproof hoster" or sits in certain enemy states, initiate enforcement via actors like governments to get these bad ISPs disconnected from the Internet.
by mschuster91
4/2/2025 at 5:22:54 AM
Why would scrapers get blocked - is scraping illegal?
by randunel
4/2/2025 at 7:17:00 AM
It's very often not, but it's still the website owner's property, and if they choose, they can show misbehaving guests the door and kindly ask them to remain on the other side (aka block them). Large-scale scraping puts a substantial burden on web properties. I was paged the other night because someone decided it would be a great idea to throw 200,000 rq/s for a few minutes at a publicly available, volunteer-run service.
by Xylakant
4/2/2025 at 6:15:15 AM
I don't know if it is, but I also don't think we are required to let dumb bots repeatedly assault our web sites if we can find a technical way to get around it.
by eitland
4/2/2025 at 11:26:56 AM
They do mitigate known vulnerabilities.
by cluckindan
4/3/2025 at 9:02:29 AM
They may mitigate known proofs of concept of vulnerabilities, and those mitigations require only a small amount of creativity to work around - at the cost of randomly breaking things.
by rcxdude
4/1/2025 at 11:40:43 PM
But are they less shit than the shitty software they filter traffic for?
by UltraSane