3/29/2026 at 8:15:33 AM
The thing that has been bothering me for a while is that the USB spec allows for software detection of capabilities. You can read the eMarker data and see the supported protocols, speeds, voltages, etc. But there is no standard for USB controllers to present this data to the OS. So it’s stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and the other device support higher speeds/more power, but your cable is limiting it.
Apple seems best able to do this since they control the hardware and OS, yet they aren’t doing it either. Users are just left to be confused about why things are slow.
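(On Linux, at least, some Type-C port drivers do surface part of this. A rough sketch of reading it, assuming a kernel whose Type-C controller driver populates /sys/class/typec, which many machines' firmware never does:)

```python
# Sketch: dump whatever USB-C cable identity the kernel exposes.
# Assumes a Linux host whose Type-C controller driver populates
# /sys/class/typec (many machines don't, which is the gap described above).
from pathlib import Path

def read_cable_info(port: str = "port0") -> dict:
    """Return all readable attributes of the cable attached to `port`."""
    cable = Path("/sys/class/typec") / f"{port}-cable"
    info = {}
    if not cable.exists():
        return info  # the eMarker data never made it out of firmware
    for attr in cable.rglob("*"):
        if attr.is_file():
            try:
                info[str(attr.relative_to(cable))] = attr.read_text().strip()
            except OSError:
                pass  # some attributes are write-only or unreadable
    return info
```

(On machines where the driver cooperates, this surfaces things like the cable type and identity VDOs; everywhere else it returns nothing, which is exactly the complaint.)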
by Gigachad
3/29/2026 at 8:25:41 AM
> In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
I'm pretty sure my old Dell XPS laptop with Windows 10 had pop-ups just like this.
"This device can run faster" or something.
by avian
3/29/2026 at 9:50:23 AM
AFAIK that's just when plugging a USB 3 device into a USB 2 port or using a USB 2 cable.
by Vogtinator
3/29/2026 at 4:51:27 PM
> that's just when plugging in a USB 3 device into a USB 2 port
Dell XPS laptops (and some others) can also warn if the charger isn't providing the full wattage the laptop is rated for. This warning is an option that can be turned off in the BIOS settings.
I usually turn it off because I sometimes intentionally do day trips with a smaller/lighter portable charger that delivers 45W to my laptop, which can need up to 65W due to having a discrete GPU. However, 45W is more than sufficient to charge the laptop during normal use on the Balanced power plan with the iGPU. I only need more than 45W when gaming with the discrete GPU active.
by mrandish
3/29/2026 at 5:46:33 PM
Just this morning, my old Latitude failed to boot with a “this charger is only giving 20W and that’s not enough to boot this laptop” error. (I was testing a new USB-C charger that’s obviously going back.) The weirdest part was that it was 100% charged, so it could have booted with 0 watts of charger but decided not to boot with 20 watts more.
by sokoloff
3/29/2026 at 5:55:30 PM
Oh, refusing to boot at all is evil. I've never seen that. Sure, you or I would just unplug the charger and run on battery, but bad UX decisions like that generate a support call to me from my 95-year-old mom. It should not only warn and continue to boot, it should use whatever power is on offer to reduce the rate of battery drain.
by mrandish
3/29/2026 at 9:41:05 PM
Interesting that it refused to boot. If I have a lower-wattage charger connected on boot, it shows me that information, but I can just press enter to continue. It's just a warning.
Maybe it's a BIOS setting?
Workaround is of course to boot without a charger connected and then connect it later :)
by koyote
3/29/2026 at 6:57:07 PM
My wife's work laptop gives this stupid warning any time any USB-C charger is plugged in, other than the Dell brick. So even a dock delivering 100W would get a complaint. The Dell brick offers non-standard charging at 140W, which can't be replaced by standards-compliant, smaller chargers.
by mleo
3/29/2026 at 8:51:06 AM
Even Apple now has one of those, when you plug something into the USB 2 port on the MacBook Neo.
by LoganDark
3/29/2026 at 12:17:47 PM
There’s still nothing when you plug a USB 3 device in using a USB 2 cable.
by Gigachad
3/29/2026 at 11:49:21 AM
I wonder if it's possible for a regular machine with two high speed ports to do a cable test by itself. Maybe it can't test all the attributes but could it at least verify speed claims in software?
by imglorp
3/29/2026 at 5:01:47 PM
Apparently the USB driver stack doesn't report the cable's eMarker chip data back to the OS. However, benchmarking actual transfer throughput is the ultimate test for data connections (vs charging use cases). Unfortunately, TFA doesn't really go into this aspect of cable testing, as the tester seems to only report eMarker data, which pins are connected, and copper resistance. Since a >$1,000 automated lab cable throughput tester is overkill, my rule-of-thumb test for high-speed USB-C data cables is to run a disk speed benchmark against a very fast, well-characterized external NVMe enclosure with a known-fast NVMe drive. I know what the throughput should be based on prior tests with an $80 active 1 m Thunderbolt cable made for high-end USB-C docks, confirmed by online benchmark reviews from credible sources.
by mrandish
3/29/2026 at 12:24:01 PM
There would be too many factors involved for a proper test. Many laptop USB controllers would probably not even have the capacity to run two ports at full speed simultaneously.
by Gigachad
3/29/2026 at 2:40:39 PM
I strongly suspected my old XPS had nonstandard things going on with its USB-C charger.
by colechristensen
3/29/2026 at 4:34:09 PM
Perhaps someday it will earn the same level of importance as charging; iOS 26 calls out slow chargers on their iPhones, so you can run to the Apple Store and buy a fast one! They probably have to weigh potential new hardware sales against added complexity. I have counterpoints too, but I believe they try to protect users’ mental models of their ecosystem (which perhaps I appreciate when I don’t notice, and can’t stand when something is uncustomizable). Like there are enough variables they don’t trust us with as it is.
by Barbing
3/29/2026 at 7:35:00 PM
> iOS 26 calls out slow chargers on their iPhones, so you can run to the Apple Store and buy a fast one!
You jest, but that notification (it's been a thing on Android for at least 8 years, and on ThinkPads for at least 10) has been very helpful to me. Sometimes the negotiation just fails and being told is helpful. Sometimes the charger lies about its specs, and once again it's helpful to have a hint, rather than expect everybody to systematically have USB testers on hand.
by tredre3
3/29/2026 at 7:53:16 PM
This one is pretty simple to do. It requests a voltage and then starts pulling current, monitoring the voltage as it increases its current draw. If the voltage goes down, alert the user.
With data speed I think it could be a little more complicated. Like OP was saying, it would need access to some level of hardware information where it can see which pins are used by the cable, since the connection 'speed' is still variable even when you DO have a supported cable.
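(The droop check described there is simple enough to sketch. `pd.set_current_ma` and `pd.read_voltage_mv` are made-up names for a hypothetical PD-controller interface; on real hardware this lives in firmware, not app code:)

```python
# Sketch of the voltage-droop check: ramp current and back off when the
# bus voltage sags, assuming a hypothetical `pd` controller object with
# set_current_ma()/read_voltage_mv() (invented names for illustration).
def find_safe_current(pd, requested_mv: int, max_ma: int,
                      step_ma: int = 250, droop_limit_mv: int = 500) -> int:
    """Ramp up current draw, backing off once the bus voltage sags."""
    safe_ma = 0
    for draw_ma in range(step_ma, max_ma + 1, step_ma):
        pd.set_current_ma(draw_ma)
        if requested_mv - pd.read_voltage_mv() > droop_limit_mv:
            break                 # voltage sagged: charger/cable can't keep up
        safe_ma = draw_ma
    pd.set_current_ma(safe_ma)    # settle at the last level that held up
    return safe_ma
```

(If `safe_ma` comes in well under what the charger advertised, that is the point to alert the user.)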
by s3p
3/29/2026 at 8:21:32 PM
I do jest, it’s a great feature. I never considered charger negotiation failure!
by Barbing
3/30/2026 at 7:03:45 AM
It's fun to see the Treedix tester come up on HN. I got one a few months back and have quite enjoyed using it. One thing I did find interesting was that one of the cables had eMarker data that lied. IIRC, the eMarker data suggested it supported much higher speeds and wattage than it did. Fortunately, the other testing screens successfully detected it only had USB 2.0 wires even though it claimed to support 40 Gb/s.
by EnnEmmEss
3/29/2026 at 12:07:29 PM
> But there is no standard for USB controllers to present this data to the OS. So it’s stuck in the low level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
There is. I used to use a KVM with USB 2 ports connected to my PC's USB 3 port, to which I connected a monitor with an integrated USB 3 hub to drive my keyboard and mouse. Windows would show a popup every time telling me that I should use a faster cable.
There are also popups telling me that my laptop is connected to a "slow" USB-C charger.
by vladvasiliu
3/29/2026 at 3:32:55 PM
That’s quite a simplistic one unfortunately - USB 2 and 3 use different controllers in the PC, which it can indeed detect. The sub-flavours of 3/4, less so.
by bdavbdav
3/29/2026 at 2:04:08 PM
I've used all manner of archaic USB cables for data transfer when in a pinch and Windows has never shown me anything at all. Could it be the external device you were connecting to triggering the Windows notification?
by wholinator2
3/29/2026 at 1:17:48 PM
I have seen these kinds of notifications on occasion but they are far from the norm.
by Forgeties79
3/30/2026 at 5:26:43 AM
Android must have this in some form. My Pixel phone with a third-party app can show the voltage and current mode selected via the PD protocols. Using DevCheck might show 2.2A/9V as an example.
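(Under the hood, apps like that read the kernel's power_supply class on Linux/Android. A minimal sketch; the supply name varies by device, and "battery" is only a common default:)

```python
# Read the negotiated charging voltage/current the way a monitoring app
# might: via Linux's power_supply sysfs class. The supply name "battery"
# is a common default but varies by device.
from pathlib import Path

def charging_stats(supply: str = "battery") -> dict:
    base = Path("/sys/class/power_supply") / supply
    stats = {}
    # The kernel reports microvolts/microamps; convert to volts/amps.
    for name in ("voltage_now", "current_now"):
        f = base / name
        if f.exists():
            stats[name] = int(f.read_text()) / 1_000_000
    return stats
```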
by prism56
3/29/2026 at 11:00:36 AM
On iPhone, when connecting an external MIDI device via USB, the phone told me that the device was drawing too much power and would be disabled.I don’t know if they check that via USB protocol, or if they are measuring the actual power draw on the USB port.
In order to use the device, I had to connect it via an externally powered USB hub.
by QuantumNomad_
3/29/2026 at 9:28:37 AM
I suspect most users do not even realise things are slow.
by graemep
3/29/2026 at 10:44:48 AM
Oh, they very much do. But like with everything in technology, they can do fuck all about it, so they resign, and maybe complain to you occasionally if you're the designated (in)voluntary tech support person for your family and friends.
Regular people hate technology, both for how magical and how badly broken it is, but they've long learned they're powerless to change it - nobody listens to their complaints, and the whole market is supply-driven, i.e. you get to choose from what vendors graciously put on the market, not from the space of possible devices.
by TeMPOraL
3/29/2026 at 11:42:04 AM
They also tend to hate technology because us nerds are often unbearable.
They hate having to go through people who get them upset in order to use their kit.
Not just tech (although it’s more prevalent). People who are “handy” can also be that way (but, for some reason, techies tend to be more abrasive).
I’ve learned the utility of being patient, and not showing the exasperation that is often boiling inside of me.
by ChrisMarshallNY
3/29/2026 at 5:24:43 PM
Amen. I couldn’t have said it better. In general, for the 40+ years I’ve been a programmer, I have detested the practice of not surfacing diagnostic information to users when technology makes it possible to do so in a clear and unambiguous way.
by tomcam
3/29/2026 at 6:16:10 PM
Most users tend to ignore diagnostic information.
"What did the error message say?"
"I don't know."
by graemep
3/29/2026 at 7:11:19 PM
This is because error messages have historically been bad, unintelligible, un-actionable, and hard to separate from soft errors that don't actually matter.
'Segmentation fault. Core dumped.'
'Non-fatal error detected. Contact support.'
'An error occurred.'
'An illegal operation was performed.'
'Error 92: Insufficient marmalade.'
'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?'
'Saving as .docx is not recommended because blah-blah-blah never gonna give you up nor let you down.'
I can't blame any normal user for either not understanding or not giving a shit about any of these. If we'd given users actionable information from day 1, we'd be in a very different world. Even just 'Error 852: Couldn't reach the network. Check your connection to the internet.' does help those who haven't turned off their brains entirely yet.
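(The shape of an actionable error is easy to codify. A toy sketch of the what/why/what-now pattern argued for above; the class and messages are invented for illustration:)

```python
# Toy sketch: every user-facing error names the thing that failed,
# the cause, and an action the user can take - instead of "An error occurred."
class UserFacingError(Exception):
    def __init__(self, what: str, why: str, action: str):
        self.what, self.why, self.action = what, why, action
        super().__init__(f"{what}: {why} {action}")

try:
    raise UserFacingError(
        "Couldn't save report.pdf",
        "the disk is full.",
        "Free up space or pick another drive, then try again.",
    )
except UserFacingError as err:
    print(err)   # one line a user can actually act on
```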
by Telaneo
3/29/2026 at 7:53:13 PM
30 or so years back, one of the Mac magazines had a customer support quote along these lines:
"I don't understand, it says 'System Error Type 11', and no matter how many times I type 11, nothing happens!"
by ben_w
3/29/2026 at 8:54:07 PM
Now imagine if that error said 'Error 11: A memory error occurred. Your program may be faulty or misbehaving. Contact your software vendor.' That's miles better than what most things provide.
That one's a good example of why these things are hard. The user could have been running 5 different programs, any one of which caused this error, and MacOS can't point the finger at anyone. Not to mention that the problem could be MacOS itself, or the user being a dunce who misconfigured something. I'm not sure if that error can occur without 3rd party software being involved, but if it can, then that error message might need to be even more vague, helping the user even less. Not to mention it could just be faulty hardware.
A paper manual offering troubleshooting steps for each error would be really helpful. Just 'Error 11. Consult your manual.' and the manual actually telling you what the problem could be is also miles better than what we usually get.
by Telaneo
3/30/2026 at 12:36:09 AM
> The user could have been running 5 different programs, any one of which caused this error, and MacOS can't point the finger at anyone.
It's still an example of why it's worth giving your users a fighting chance. MacOS may not know enough to point the finger at anyone, but the user knows what they were doing at that moment, and even if they were not paying attention, they might start now. They'll realize if something is off. Or, after the 10th time they get this error, they'll connect the dots and realize it's always happening when application X is running and they try to launch Y.
Or maybe sometimes they won't. Maybe they'll form a story and maybe it'll be all bullshit, or maybe good enough. Either way, the important part is, the user retains agency in the process. Giving people information is how they can become self-sufficient users and trust technology more.
by TeMPOraL
3/29/2026 at 9:06:41 PM
This was 30 years ago; it was Mac OS classic with co-operative multitasking and zero inter-process memory protection. When the error came up, the only option was "restart" (the computer, not the task).
by ben_w
3/29/2026 at 9:14:38 PM
I know.
by Telaneo
3/29/2026 at 9:00:38 PM
The author Terry Pratchett had some of the best error messages in his Discworld novels. The Hex computer could produce the following:
++?????++ Out of Cheese Error. Redo From Start.
+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
+++Whoops! Here comes the cheese! +++
by stoneman24
3/29/2026 at 11:46:02 PM
Don't blame this one on programming techies. This one is ALL the fault of shitty UI designers abusing modal dialog boxes.
A modal dialog is supposed to be for something damn near irreversible - like being about to blow away your application because of an error. You are supposed to STOP and go get the guru or you are about to lose, badly.
Unfortunately, UI designers throw them up for everything and people get used to simply clicking "OK" to make them go away so that they can get back to doing their task. So, when the user gets an actual error, they've already blown away the dialog box with information.
Your 'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?' line is a horrifically excellent example. That is a standard "Save As..." response, and it should NEVER have been. That should have always been under "Export..." as saving should never throw away information and it would be perfectly fine to regenerate a JPG as long as you have the full information still available in the original file.
This is the stuff that infuriates me about the UI designers. Your job is about interactions, first, and pixels, second.
by bsder
3/30/2026 at 4:00:59 AM
> This is because error messages have historically been bad, unintelligible, un-actionable, and hard to separate from soft errors that don't actually matter.
And they've only got worse: "Something went wrong". Well no shit, Sherlock, I can tell something went wrong because the thing I tried to do didn't work. Possibly the single most useless error message ever created, and it's everywhere. Most of the worst-case error messages in the quoted response are still better than this one.
If you ever run into a developer who thinks "something went wrong" is an appropriate error message, have them killed. Then kill their entire family and pets, burn their house down, and plough salt into the ground where it stood. Finally, put up a sign that says "The person who used to live here thought 'something went wrong' is an appropriate error message to display when something goes wrong. Take note of their current situation when you next add an error message to your software".
by pseudohadamard
3/29/2026 at 1:52:41 PM
I had a programmer pushing multi-gig packages to a Meta Quest 3, and it was taking around a minute. He didn’t even think that it could be faster, because he assumed the Quest or its software was slow, and didn’t check.
I implored him to try a different cable (after checking cables with the Treedix mentioned in TFA), and the copy went from taking over a minute to about 13s.
It’s not just normal people who are confused.
by dijit
3/29/2026 at 3:51:48 PM
I find some programmers (and this is presumably true of any industry) very narrow in their expertise within technology.
by bdavbdav
3/29/2026 at 7:46:11 PM
Yeah, most programmers are not curious hackers anymore. They are 9-5 white collar workers with hobbies far outside of programming, systems, hardware, etc. It shows very much as soon as you meet one of them. But, like you said, this is true of any industry.
Oh, and a pointy jab: these folks are also, in my opinion/experience, the most eager to vibecode shit. Make of that what you will.
by lpcvoid
3/29/2026 at 7:56:37 PM
"anymore"? Over a decade ago, a coworker had a path for updating some app's files to a database, and it was taking something like 10 minutes on certain test inputs.Swore blind it couldn't be improved.
By next morning's stand-up, I'd found it was doing something pointless, confirmed with the CTO that the thing it was doing was genuinely pointless and I'd not missed anything surprising, removed the pointless thing, and gotten the 10 minutes down to 200 milliseconds.
I'm not sure if you're right or wrong about the correlation with vibe-coding here, but I will say that coworker's code was significantly worse than Claude's on the one hand, and on the other that I have managed to convince Codex to recompute an isochrone map of Berlin at 13 fps in a web browser.
by ben_w
3/29/2026 at 8:03:34 PM
I do feel like the industry has taken a nosedive quality-wise over COVID in particular. Lots of new people only in tech for the money, with no deep idea about computers.
But I know stories like yours from a decade past as well. A tale as old as time, but compounding in recent years - IMHO.
by lpcvoid
3/29/2026 at 8:14:35 PM
Could be, but I think the rot I see now predates the pandemic, possibly even before then: https://benwheatley.github.io/blog/2024/04/07-21.31.19.html
by ben_w
3/30/2026 at 12:40:10 AM
I blame it on "software eating the world" (in general) - at some point, about two decades ago, it started to become obvious to everyone that programming is the golden ticket to life - an easy desk job paying stupid amounts of money, with no barriers to entry. So very quickly the pool of students, and then employees, became dominated by people who joined in for the pay, not because of interest in technology itself.
by TeMPOraL
3/30/2026 at 4:07:29 AM
Obligatory link: https://thedailywtf.com/. It's full of stories like this.
by pseudohadamard
3/29/2026 at 12:22:13 PM
I think you are right, but I think what I said is also true.
People will notice some things. For example, with USB, if they are using it for local backup they might notice, but with a lot of devices they will not. When they do notice, they will feel powerless.
Even if we had a wider choice, they are not well placed to pick products. There is no way they will know about details such as USB issues (a slow cable, which the device will not tell you about) at the time of purchase.
by graemep
3/29/2026 at 1:19:23 PM
I think any of us just have to look at how many people ask us for recommendations on basic things like docks and adaptors to see how common this is. On top of that you can’t even trust what’s on the tin sometimes.by Forgeties79
3/29/2026 at 12:21:24 PM
This is true of basically everything. Even with trivial home maintenance, people will just put up with things being broken most of the time rather than learn how to fix them.
by Gigachad
3/29/2026 at 7:48:09 PM
I've lived in this apartment for about a year and a half. It took me until last week to put up lights over the stairs. I've been walking on the stairs in the dark, sometimes using my phone as a light.
I'm an electrician.
by encom
3/29/2026 at 8:57:06 PM
Physician, heal thyself. The cobbler's children have no shoes.
by Telaneo
3/30/2026 at 3:23:43 AM
Does it matter, for anyone other than hardcore geeks? All the OS would care about is how much power it can deliver and what data speed it provides, not whether the exception handling on page 4,096, section 4(a)2.1, paragraph 4 of the spec has been implemented.
by pseudohadamard