alt.hn

4/21/2025 at 4:36:57 PM

Show HN: Light like the Terminal – Meet GTK LLM Chat Front End

https://github.com/icarito/gtk-llm-chat/

by icarito

4/21/2025 at 6:56:49 PM

It’d be better if it were written in C, or at least Vala. With Python, you have to wait a couple hundred milliseconds for the interpreter to start, which makes it feel less native than it could. That said, the latency of the LLM responses is higher than the UI's, so I guess the slowness of Python doesn't matter.

by guessmyname

4/21/2025 at 7:06:28 PM

Yeah, I agree; I've been thinking about using Rust. But ultimately it's also a GTK3 vs. GTK4 problem: if we could reuse the Python interpreter from the applet, that would speed things up, but GTK4 has no support for AppIndicator icons(!).

I've been pondering whether to backport to GTK3 for this sole purpose. I find that after the initial startup delay, the app's speed is okay...

Porting to Rust isn't really planned because I'd lose the llm-python base, but it's still something that piques my curiosity.

by icarito

4/21/2025 at 7:46:37 PM

What's the startup time now on a 9950X3D, after a prior start so the .pyc files are cached in RAM?

by cma

4/22/2025 at 4:26:07 PM

Hey, I felt bad about the longer delay, and by making sure to lazy-load everything I could, I managed to bring the startup time down from 2.2 seconds to 0.6 on my machine! Massive improvement. Thanks for the challenge!
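The lazy-loading idea can be sketched like this: a small proxy that defers the real import until first attribute access, so interpreter startup pays nothing up front. This is a generic sketch, not gtk-llm-chat's actual code; `json` stands in for a heavy dependency like `gi` or `llm`.

```python
# Sketch of the lazy-loading approach: defer heavy imports until first
# use so interpreter startup stays fast.
import importlib


class LazyModule:
    """Proxy that imports the real module on first attribute access."""

    def __init__(self, name):
        self._name = name
        self._module = None

    def __getattr__(self, attr):
        if self._module is None:
            self._module = importlib.import_module(self._name)
        return getattr(self._module, attr)


json = LazyModule("json")        # no import cost paid yet
print(json.dumps({"ok": True}))  # the real import happens here
```

Modern alternatives do the same thing more cleanly, e.g. moving `import` statements inside the functions that need them.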

by icarito

4/23/2025 at 2:50:00 AM

Nice, that's a huge difference.

by cma

4/21/2025 at 9:10:47 PM

On a laptop 7735HS under WSL2, I get 15 ms for the interpreter to start and exit without any imports.
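For anyone wanting to reproduce that measurement, a rough check of bare interpreter startup looks like this; numbers vary with CPU, Python build, and whether the filesystem cache is warm:

```shell
# Time a no-op interpreter start; run it twice so the second run
# reflects warm caches. hyperfine (if installed) gives cleaner stats.
time python3 -c "pass"
time python3 -c "pass"
# hyperfine 'python3 -c "pass"'
```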

by cma

4/21/2025 at 10:23:40 PM

I've got an i5-10210U CPU @ 1.60GHz.

You piqued my curiosity. The chat window consistently takes 2.28 s to start, while the Python interpreter itself takes roughly 30 ms. I'll do some profiling.
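One cheap way to do that profiling, assuming CPython: time a fresh interpreter importing each candidate module and subtract the bare-startup baseline. The stdlib modules below are stand-ins for heavy dependencies like `gi` or `llm`; CPython's `-X importtime` flag gives a finer per-import breakdown on stderr.

```python
# Attribute startup time to imports: launch a fresh interpreter per
# module and compare against bare interpreter startup.
import subprocess
import sys
import time


def import_cost(module: str) -> float:
    """Seconds for a fresh interpreter to start, import `module`, exit."""
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start


baseline = import_cost("sys")  # sys is preloaded, so this is startup alone
for mod in ("json", "asyncio"):
    print(f"{mod}: {import_cost(mod) - baseline:+.3f}s vs bare startup")
```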

by icarito

4/21/2025 at 7:58:06 PM

I wonder! On my more modest setup it takes perhaps a couple of seconds. After that it's quite usable.

by icarito

4/21/2025 at 6:30:40 PM

This looks quite nice. I would like to see the system prompt and inference parameters exposed in the UI, because those are things I'm used to fiddling with in other UIs. Is that something that the llm library supports?

by Gracana

4/21/2025 at 6:45:08 PM

Yeah, absolutely. I've just gotten to the point where I'm happy with the architecture, so I'll continue adding UI. I've just added support for fragments, and I'm thinking of presenting them as if they were attached documents. On my radar: switching models mid-conversation, and perhaps the ability to roll back a conversation or remove some messages. But yes, the system prompt and parameters would be nice to expose too! Thanks for the suggestions!

by icarito

4/21/2025 at 7:27:40 PM

Awesome. It would be great to see a nice gtk-based open source competitor to lm-studio and the like.

by Gracana

4/21/2025 at 7:58:46 PM

Does this work on Mac, or Linux only?

by indigodaddy

4/21/2025 at 8:06:19 PM

I'd truly like to know! But I have no access to a Mac to try. If you can, give it a go and let me know? If it works, please send a screenshot!

by icarito

4/22/2025 at 8:36:35 PM

Confirmed, it works on Mac!

gtk-chat at least; I'm having some issues with the notification lib for gtk-applet.

screenshot: https://postimg.cc/KKxQNdG6

by tough