4/25/2025 at 11:27:16 PM
It is an interesting tool. I've been struggling with Excel's inability to open large files. I always work with CSV in Python, and if a client must review the data in Excel, I take a random sample to generate a smaller file (something along the lines of the sketch after this comment), then explain to the client that we can't open the whole thing in Excel. This really doesn't feel like a modern way of working.
By the way, the phrase "a slow, old pc with 8GB RAM" struck me as humour of the era. Oh god, it has 8GB of RAM. Cheers! To the good old days.
by tianqi
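A minimal sketch of that kind of down-sampling with pandas; the file names, chunk size, and sample fraction are hypothetical placeholders, not anything from the tool under discussion:

    import pandas as pd

    # Hypothetical file names and sizes -- adjust to the real data.
    SOURCE = "full_export.csv"
    TARGET = "client_review_sample.csv"
    SAMPLE_ROWS = 50_000          # comfortably below Excel's row limit

    # Stream the big csv in chunks so it never has to fit in memory at once,
    # keep a random 5% of each chunk, then trim the result to SAMPLE_ROWS rows.
    sampled = pd.concat(
        chunk.sample(frac=0.05, random_state=42)
        for chunk in pd.read_csv(SOURCE, chunksize=250_000)
    )
    sampled = sampled.sample(n=min(SAMPLE_ROWS, len(sampled)), random_state=42)

    sampled.to_csv(TARGET, index=False)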
4/26/2025 at 7:51:18 AM
Fully agree. I have found that above 300k rows, Excel struggles even on a good laptop. Not to mention the Python integration in MS Excel, which is so unbearably slow that it is much better to perform the calculations outside of Excel first. I am sold on the website's look and the license model!
by samzub
4/26/2025 at 9:04:16 AM
These days I use DuckDB to read massive Excel files. DuckDB now ships with a nice local UI, and it also works beautifully with DataGrip, my preferred database IDE. With SQL, it just becomes a matter of applying the old grease to do whatever analysis I want (see the sketch after this comment).
by microflash
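A minimal sketch of that kind of DuckDB query from Python, assuming a recent DuckDB build whose excel extension provides read_xlsx (older builds typically went through other extensions); the file and column names are hypothetical placeholders:

    import duckdb  # pip install duckdb

    con = duckdb.connect()  # in-memory database
    con.execute("INSTALL excel; LOAD excel;")

    # 'big.xlsx', 'region' and 'amount' are hypothetical placeholders.
    result = con.sql("""
        SELECT region, count(*) AS orders, sum(amount) AS total
        FROM read_xlsx('big.xlsx')
        GROUP BY region
        ORDER BY total DESC
    """).df()  # materialise only the small aggregate as a pandas DataFrame

    print(result)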
4/26/2025 at 1:49:26 AM
I've been using Tad [0] for this purpose, as it streams in the data.
by arvindh-manian